Sunday 27 September 2015
BIG WHITE WALL
A safe online community for people who are anxious, down or not coping, where members support and help each other by sharing what’s troubling them, guided by trained professionals.
Available 24/7, Big White Wall is completely anonymous so you can express yourself freely and openly. Professionally trained Wall Guides ensure the safety and anonymity of all members.
Big White Wall is available free in many areas of the UK via the NHS, employers, and universities. It is also free to all UK serving personnel, veterans, and their families.
Thursday 17 September 2015
1918 Flu Pandemic
The influenza or flu pandemic of 1918 to 1919, the deadliest in modern history, infected an estimated 500 million people worldwide–about one-third of the planet’s population at the time–and killed an estimated 20 million to 50 million victims. More than 25 percent of the U.S. population became sick, and some 675,000 Americans died during the pandemic. The 1918 flu was first observed in Europe, the U.S. and parts of Asia before swiftly spreading around the world. Surprisingly, many flu victims were young, otherwise healthy adults. At the time, there were no effective drugs or vaccines to treat this killer flu strain or prevent its spread. In the U.S., citizens were ordered to wear masks, and schools, theaters and other public places were shuttered. Researchers later discovered what made the 1918 pandemic so deadly: In many victims, the influenza virus had invaded their lungs and caused pneumonia.
Flu Facts
Influenza, or flu, is a virus that attacks the respiratory system. The flu virus is highly contagious: When an infected person coughs, sneezes or talks, respiratory droplets are generated and transmitted into the air, and can then be inhaled by anyone nearby. Additionally, a person who touches something with the virus on it and then touches his or her mouth, eyes or nose can become infected.
Flu outbreaks happen every year and vary in severity, depending in part on what type of virus is spreading. (Flu viruses, which are divided into three broad categories, can rapidly mutate.) In the U.S., “flu season” generally runs from late fall into spring. In a typical year, more than 200,000 Americans are hospitalized for flu-related complications, and over the past three decades, there have been some 3,000 to 49,000 flu-related deaths in the U.S. annually, according to the Centers for Disease Control and Prevention. Young children, people over age 65, pregnant women and people with certain medical conditions, such as asthma, diabetes or heart disease, face a higher risk of flu-related complications, including pneumonia, ear and sinus infections and bronchitis. A flu pandemic, such as the one in 1918, occurs when an especially virulent new influenza strain for which there’s little or no immunity appears and spreads quickly from person-to-person around the globe.
The Flu Strikes Far and Wide
The first wave of the 1918 pandemic occurred in the spring and was generally mild. The sick, who experienced such typical flu symptoms as chills, fever and fatigue, usually recovered after several days, and the number of reported deaths was low. However, a second, highly contagious wave of influenza appeared with a vengeance in the fall of that same year. Victims died within hours or days of their symptoms appearing, their skin turning blue and their lungs filling with fluid that caused them to suffocate. In just one year, 1918, the average life expectancy in America plummeted by a dozen years.
It’s unknown exactly where the particular strain of influenza that caused the pandemic came from; however, the 1918 flu was first observed in Europe, America and areas of Asia before spreading to almost every other part of the planet within a matter of months. Despite the fact that the 1918 flu wasn’t isolated to one place, it became known around the world as the Spanish flu, as Spain was one of the earliest countries to be hit hard by the disease. Even Spain’s king, Alfonso XIII (1886-1931), contracted the flu.
One unusual aspect of the 1918 flu was that it struck down many previously healthy, young people–a group normally resistant to this type of infectious illness–including a number of World War I (1914-18) servicemen. In fact, journalist Gina Kolata has reported that more U.S. soldiers died from the 1918 flu than were killed in battle during the war. Forty percent of the U.S. Navy was hit with the flu, while 36 percent of the Army became ill, notes Kolata in her book on the subject.
Although the death toll attributed to the 1918 flu is often estimated at 20 million to 50 million victims worldwide, other estimates run as high as 100 million victims. The exact numbers are impossible to know due to a lack of medical record-keeping in many places. What is known, however, is that few locations were immune to the 1918 flu–in America, victims ranged from residents of major cities to those of remote Alaskan communities. Even President Woodrow Wilson (1856-1924) reportedly contracted the flu in early 1919 while negotiating the Treaty of Versailles, which ended World War I.
Fighting the Flu
When the 1918 flu hit, doctors and scientists were unsure what caused it or how to treat it. Unlike today, there were no effective vaccines or antivirals, drugs that treat the flu. (The first licensed flu vaccine appeared in America in the 1940s; by the following decade, vaccine manufacturers could routinely produce vaccines that would help control and prevent future pandemics, according to the U.S. Department of Health and Human Services.)
Complicating matters was the fact that World War I had left parts of America with a shortage of physicians and other health workers. And of the available medical personnel in the U.S., many came down with the flu themselves. Additionally, hospitals in some areas were so overloaded with flu patients that schools, private homes and other buildings had to be converted into makeshift hospitals, some of which were staffed by medical students.
Officials in some communities imposed quarantines, ordered citizens to wear masks and shut down public places, including schools, churches and theaters. People were advised to avoid shaking hands and to stay indoors, libraries put a halt on lending books and regulations were passed banning spitting. According to an April 30, 2009, report in The New York Times, during the pandemic, Boy Scouts in New York City approached people they’d seen spitting on the street and gave them cards that read: “You are in violation of the Sanitary Code.”
The Flu Takes Heavy Toll on Society
The flu took a heavy human toll, wiping out entire families and leaving countless widows and orphans in its wake. Funeral parlors were overwhelmed and bodies piled up. Some people even had to dig graves for their own family members.
The flu was also detrimental to the economy. In the U.S., businesses were forced to shut down because so many employees were sick. Basic services such as mail delivery and garbage collection were hindered due to flu-stricken workers. In some places there weren’t enough farm workers to harvest crops. Even state and local health departments closed for business, hampering efforts to chronicle the spread of the 1918 flu and provide the public with answers about it.
Flu Pandemic Comes to an End
By the summer of 1919, the flu pandemic came to an end, as those who had been infected either died or developed immunity. Almost 90 years later, in 2008, researchers announced they’d discovered what made the 1918 flu so deadly: A group of three genes enabled the virus to weaken a victim’s bronchial tubes and lungs and clear the way for bacterial pneumonia.
Since 1918, there have been several other influenza pandemics, although none as deadly. A flu pandemic from 1957 to 1958 killed around 2 million people worldwide, including some 70,000 people in the U.S., and a pandemic from 1968 to 1969 killed approximately 1 million people, including some 34,000 Americans. More than 12,000 Americans perished during the H1N1 (or “swine flu”) pandemic that occurred from 2009 to 2010.
Battle of Vimy Ridge
This World War I battle in 1917 marked the first time that the Allies’ four Canadian divisions attacked together as the Canadian Corps. The corps launched its offensive at Vimy on Easter Sunday, and within three days had overrun the German defenses. The swift victory was achieved through excellent artillery preparation and effective infantry tactics, and was aided by a large Canadian advantage in numbers, which allowed the corps to absorb heavy casualties. Part of the larger Battle of Arras, the capture of Vimy Ridge helped establish the Canadian Corps as a premier fighting force.
For the first time in World War I, the four Canadian divisions attacked together as the Canadian Corps, at Vimy in northern France. Some historians have seen this as a pivotal moment in the development of a Canadian identity. Vimy Ridge had defied previous attacks by the Allies, but in early 1917 its capture formed part of a larger battle, supporting a British attack at Arras, which itself assisted a major French offensive.
On Easter Sunday, April 9, at 5:30 a.m., the Canadian Corps swept forward in a sleet storm and took nearly all its objectives on schedule that day. Over the next three days, the last German defenses on the left were captured. The swift victory was achieved primarily through excellent artillery preparation and a creeping barrage; good infantry training and execution, effective infantry tactics (“leaning on the barrage”), poor German defensive plans, the sleet as cover, and the use of underground caves and tunnels also contributed to the success. The Canadian Corps outnumbered the defenders by 35,000 to 10,000 and, with flank support, deployed 1,130 guns, a concentration more than double the density used at the Somme.
Canadian casualties amounted to around 10,500, whereas the German defenders lost almost all their strength, including 4,000 prisoners. The capture of Vimy Ridge pointed to the future, where similar careful set-piece attacks in 1917 and 1918 established the Canadian Corps as a premier fighting force.
The Reader’s Companion to Military History. Edited by Robert Cowley and Geoffrey Parker. Copyright © 1996 by Houghton Mifflin Harcourt Publishing Company. All rights reserved.
The history of India begins with evidence of human activity by anatomically modern humans as long as 75,000 years ago, or with earlier hominids including Homo erectus from about 500,000 years ago.[1]
The Indus Valley Civilization, which spread and flourished in the northwestern part of the Indian subcontinent from c. 3300 to 1300 BCE in present-day Pakistan and northwest India, was the first major civilization in South Asia.[2] A sophisticated and technologically advanced urban culture developed in the Mature Harappan period, from 2600 to 1900 BCE.[3] This civilization collapsed at the start of the second millennium BCE and was later followed by the Iron Age Vedic Civilization, which extended over much of the Indo-Gangetic plain and which witnessed the rise of major polities known as the Mahajanapadas. In one of these kingdoms, Magadha, Mahavira and Gautama Buddha propagated their Shramanic philosophies during the sixth and fifth centuries BCE.
Most of the subcontinent was conquered by the Maurya Empire during the 4th and 3rd centuries BCE. From the 3rd century BCE onwards, Prakrit and Pali literature in the north and the Sangam literature in southern India started to flourish.[4][5] The famous Wootz steel originated in south India in the 3rd century BCE and was exported to foreign countries.[6][7][8]
Various parts of India were ruled by numerous Middle kingdoms for the next 1,500 years, among which the Gupta Empire stands out. This period, witnessing a Hindu religious and intellectual resurgence, is known as the classical or "Golden Age of India". During this period, aspects of Indian civilization, administration, culture, and religion (Hinduism and Buddhism) spread to much of Asia, while kingdoms in southern India had maritime business links with the Roman Empire from around 77 CE. Indian cultural influence spread over many parts of Southeast Asia which led to the establishment of Indianized kingdoms in Southeast Asia (Greater India).[9]
The most significant event between the 7th and 11th centuries was the Tripartite Struggle centered on Kannauj, which lasted for more than two centuries between the Pala Empire, Rashtrakuta Empire, and Gurjara Pratihara Empire. Southern India was ruled by the Chalukya, Chola, Pallava, Pandyan, and Western Chalukya Empires. The seventh century also saw the advent of Islam as a political power, though on the fringe, in the western part of the subcontinent, in modern-day Pakistan.[10] The Chola dynasty conquered southern India and successfully invaded parts of Southeast Asia and Sri Lanka in the 11th century.[11][12] In the early medieval period, Indian mathematics influenced the development of mathematics and astronomy in the Arab world, and the Hindu numerals were introduced.[13]
Muslim rule started in parts of north India in the 13th century when the Delhi Sultanate was founded in 1206 CE by Central Asian Turks.[14] The Delhi Sultanate ruled the major part of northern India in the early 14th century but declined in the late 14th century, when several powerful Hindu states such as the Vijayanagara Empire, Gajapati Kingdom, Ahom Kingdom and the Mewar dynasty emerged. In the 16th century, the Mughals arrived from Central Asia and gradually came to rule most of India. The Mughal Empire suffered a gradual decline in the early 18th century, which provided opportunities for the Maratha Empire, Sikh Empire and Mysore Kingdom to exercise control over large areas of the subcontinent.[15][16]
From the late 18th century to the mid-19th century, large areas of India were annexed by the British East India Company. Dissatisfaction with Company rule led to the Indian Rebellion of 1857, after which the British provinces of India were directly administered by the British Crown and witnessed a period of both rapid development of infrastructure and economic stagnation. During the first half of the 20th century, a nationwide struggle for independence was launched, led by the Indian National Congress and later joined by other organizations.
The subcontinent gained independence from the United Kingdom in 1947, after the British provinces were partitioned into the dominions of India and Pakistan and the princely states all acceded to one of the new states.
Wednesday 16 September 2015
Hockey is a family of sports in which two teams play against each other by trying to maneuver a ball or a puck into the opponent's goal using a hockey stick. In many areas, one sport (typically field hockey or ice hockey[1]) is generally referred to simply as hockey.
Cricket
Cricket is a bat-and-ball game played between two teams of 11 players each on a field at the centre of which is a rectangular 22-yard-long pitch. The game is played by 120 million players in many countries, making it the world's second most popular sport.[1][2][3] Each team takes its turn to bat, attempting to score runs, while the other team fields. Each turn is known as an innings (used for both singular and plural).
The bowler delivers the ball to the batsman who attempts to hit the ball with his bat away from the fielders so he can run to the other end of the pitch and score a run. Each batsman continues batting until he is out. The batting team continues batting until ten batsmen are out, or a specified number of overs of six balls have been bowled, at which point the teams switch roles and the fielding team comes in to bat.
In professional cricket, the length of a game ranges from 20 overs per side to Test cricket played over five days. The Laws of Cricket are maintained by the International Cricket Council (ICC) and the Marylebone Cricket Club (MCC) with additional Standard Playing Conditions for Test matches and One Day Internationals.[4]
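As a rough, illustrative sketch (not part of the original text), the snippet below models the two ways a limited-overs innings described above can end: ten batsmen dismissed, or the allotted overs of six balls each all bowled. The function and parameter names are invented for this example and do not come from any cricket library.

```python
# Minimal sketch, assuming the rules summarized above: an innings is complete
# once ten wickets have fallen or the allotted overs (six legal balls each)
# have all been bowled. Names are illustrative only.

def innings_complete(wickets: int, balls_bowled: int, overs_limit: int) -> bool:
    """Return True once the batting side's innings is finished."""
    return wickets >= 10 or balls_bowled >= overs_limit * 6

# Example: a 20-over innings lasts at most 20 * 6 = 120 legal deliveries.
print(innings_complete(wickets=4, balls_bowled=120, overs_limit=20))  # True
print(innings_complete(wickets=4, balls_bowled=90, overs_limit=20))   # False
```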
Cricket was first played in southern England in or before the 16th century, though there is evidence that a version of the game was played in the French village of Liettres as far back as 1478.[5] By the end of the 18th century, it had developed into the national sport of England. The expansion of the British Empire led to cricket being played overseas, and by the mid-19th century the first international match had been held. The ICC, the game's governing body, has 10 full members.[6] The game is most popular in Australasia, England, the Indian subcontinent, the West Indies and Southern Africa.
Football refers to a number of sports that involve, to varying degrees, kicking a ball with the foot to score a goal. Unqualified, the word football is understood to refer to whichever form of football is the most popular in the regional context in which the word appears: association football (known as soccer in some countries) in the United Kingdom; gridiron football (specifically American football or Canadian football) in the United States and Canada; Australian rules football or rugby league in different areas of Australia; Gaelic football in Ireland; and rugby football (specifically rugby union) in New Zealand.[1][2] These different variations of football are known as football codes.
Various forms of football can be identified in history, often as popular peasant games. Contemporary codes of football can be traced back to the codification of these games at English public schools in the eighteenth and nineteenth centuries.[3][4] The expanse of the British Empire allowed these rules of football to spread to areas of British influence outside of the directly controlled Empire,[5] though by the end of the nineteenth century, distinct regional codes were already developing: Gaelic football, for example, deliberately incorporated the rules of local traditional football games in order to maintain their heritage.[6] In 1888, The Football League was founded in England, becoming the first of many professional football competitions. During the twentieth century, several of the various kinds of football grew to become some of the most popular team sports in the world.
Sachin Tendulkar
Sachin Tendulkar (born 24 April 1973) is a former Indian cricketer and captain, widely regarded as one of the greatest cricketers of all time and by many as the greatest batsman of all time.[4][5][6][7] He took up cricket at the age of eleven, made his Test debut on 15 November 1989 against Pakistan in Karachi at the age of sixteen, and went on to represent Mumbai domestically and India internationally for close to twenty-four years. He is the only player to have scored one hundred international centuries and the first batsman to score a double century in a One Day International; he holds the record for the most runs in both ODI and Test cricket and is the only player to have scored more than 30,000 runs in international cricket.[8]
In 2002, just halfway through his career, Wisden Cricketers' Almanack ranked him the second-greatest Test batsman of all time, behind Don Bradman, and the second-greatest ODI batsman of all time, behind Viv Richards.[9] Later in his career, Tendulkar was part of the Indian team that won the 2011 World Cup, his first win in six World Cup appearances for India.[10] He had previously been named "Player of the Tournament" at the 2003 edition of the tournament, held in South Africa. In 2013, he was the only Indian cricketer included in an all-time Test World XI named to mark the 150th anniversary of Wisden Cricketers' Almanack.[11][12][13]
Tendulkar received the Arjuna Award in 1994 for his outstanding sporting achievement, the Rajiv Gandhi Khel Ratna award, India's highest sporting honour, in 1997, and the Padma Shri and Padma Vibhushan awards in 1999 and 2008 respectively, India's fourth- and second-highest civilian awards.[14] A few hours after his final match on 16 November 2013, the Prime Minister's Office announced the decision to award him the Bharat Ratna, India's highest civilian award.[15][16] He is the youngest recipient to date and the first ever sportsperson to receive the award.[17][18] He also won the 2010 Sir Garfield Sobers Trophy for cricketer of the year at the ICC awards.[19] In 2012, Tendulkar was nominated to the Rajya Sabha, the upper house of the Parliament of India.[20] He was also the first sportsperson and the first person without an aviation background to be awarded the honorary rank of group captain by the Indian Air Force.[21] In 2012, he was named an Honorary Member of the Order of Australia.[22][23]
In December 2012, Tendulkar announced his retirement from ODIs.[24] He retired from Twenty20 cricket in October 2013[25] and subsequently announced his retirement from all forms of cricket,[26][27] retiring on 16 November 2013 after playing his 200th and final Test match, against the West Indies in Mumbai's Wankhede Stadium.[28] Tendulkar played 664 international cricket matches in total, scoring 34,357 runs.
Cristiano Ronaldo dos Santos Aveiro GOIH (born 5 February 1985), known as Cristiano Ronaldo (Portuguese pronunciation: [kɾɨʃtiˈɐnu ʁuˈnaɫdu]), is a Portuguese professional footballer who plays for Spanish club Real Madrid and the Portugal national team. He is a forward and serves as captain for Portugal. By the age of 22, Ronaldo had received Ballon d'Or and FIFA World Player of the Year nominations. The following year, in 2008, he won his first Ballon d'Or and FIFA World Player of the Year awards. He followed this up by winning the FIFA Ballon d'Or in 2013 and 2014. In January 2014, Ronaldo scored his 400th senior career goal for club and country aged 28.[4]
Often ranked as the best player in the world[5] and rated by some in the sport as the greatest of all time,[6][7][8][9] Ronaldo is the first Portuguese footballer to win three FIFA/Ballons d'Or, and the first player to win four European Golden Shoe awards. In January 2015, Ronaldo was named the best Portuguese player of all time by the Portuguese Football Federation during its 100th anniversary celebrations.[10][11] With Manchester United and Real Madrid, he has won three Premier League titles, one La Liga, one FA Cup, two Football League Cups, two Copas del Rey, one FA Community Shield, one Supercopa de España, two UEFA Champions League titles, one UEFA Super Cup and two FIFA Club World Cups.
Ronaldo began his career as a youth player for Andorinha, where he played for two years, before moving to C.D. Nacional. In 1997, he moved to Sporting CP. In 2003 he signed for Manchester United for £12.2 million (€15 million). In 2004, he won his first trophy, the FA Cup. In 2007 and 2008, Ronaldo was named FWA Footballer of the Year, and was named the 2008 FIFA World Player of the Year. In 2009 he won the FIFA Puskás Award for Goal of the Year. He became the world's most expensive player when he moved from Manchester United to Real Madrid in 2009 in a transfer worth £80 million (€94 million/$132 million). His buyout clause is valued at €1 billion.[12] In May 2012, he became the first footballer to score against every team in a single season in La Liga.[13] Ronaldo holds the record for most goals scored in a single UEFA Champions League season, having scored 17 goals in the 2013–14 season.[14] In December 2014, Ronaldo became the fastest player to score 200 goals in La Liga, which he accomplished in his 178th La Liga game.[15] He is the only player in the history of football to score 50 or more goals in a season on five consecutive occasions.[16] In September 2015 he became the all-time top goalscorer in the UEFA Champions League with 80 goals.
Ronaldo made his international debut for Portugal in August 2003, at the age of 18. He has since been capped over 100 times and has participated in 6 major tournaments: three UEFA European Championships (2004, 2008 and 2012) and three FIFA World Cups (2006, 2010 and 2014). Ronaldo is the first Portuguese player to reach 50 international goals, making him Portugal's top goalscorer of all time as well as the country's top scorer in the European Championship with 6 goals. He scored his first international goal in the opening game of Euro 2004 against Greece, and helped Portugal reach the final. He took over captaincy in July 2008, and he led Portugal to the semi-finals at Euro 2012, finishing the competition as joint-top scorer in the process. In November 2014, Ronaldo became the all-time top scorer in the UEFA European Championship (including qualifying) with 23 goals.
Neymar da Silva Santos Júnior (Portuguese pronunciation: [nejˈmaʁ dɐ ˈsiwvɐ ˈsɐ̃tus ˈʒũɲoʁ]; born 5 February 1992), commonly known as Neymar or Neymar Jr., is a Brazilian professional footballer who plays for Spanish club FC Barcelona and the Brazil national team as a forward or winger, and is also the captain of the national team.
At the age of 19, Neymar won the 2011 South American Footballer of the Year award, after coming third in 2010.[5] He followed this up by winning it again in 2012. In 2011 Neymar received nominations for the FIFA Ballon d'Or, where he came 10th, and the FIFA Puskás Award for Goal of the Year, which he won.[6] He is known for his acceleration, dribbling skills, finishing and ability with both feet. His playing style has earned him critical acclaim, with fans, media and former players drawing comparison to former Brazil forward Pelé, who has called Neymar "an excellent player", while Ronaldo, Ronaldinho and Lionel Messi have stated "he will be the best in the world".[7][8][9][10][11]
Neymar joined Santos in 2003 and, aged 17, made his debut for the club in 2009 and was voted the Best Young Player of the 2009 Campeonato Paulista. Neymar was named player of the year as Santos won the 2010 Campeonato Paulista, and he was also top scorer in the 2010 Copa do Brasil with 11 goals. He finished the 2010 season with 42 goals in 60 games, as his club achieved the Double. Neymar was again voted the player of the year in 2011 as Santos retained the state title and also won the 2011 Copa Libertadores securing a Continental Double, Santos' first since 1963. Joining Barcelona in June 2013, in his second season at the club in 2014-15, Neymar helped them win the continental treble of La Liga, Copa del Rey and the UEFA Champions League.
Neymar has represented Brazil at Under-17, Under-20 and senior levels. On 10 August 2010, aged 18, he made his senior debut for Brazil in a friendly match against the United States; he scored in a 2–0 win. Neymar was the leading goal scorer of the 2011 South American Youth Championship with nine goals, including two in the final, in the 6–0 win against Uruguay. He was named in Brazil's squad for the 2013 Confederations Cup, and was assigned the number 10 shirt. On 30 June, Neymar scored Brazil's second goal in the 3–0 final win over Spain. His performances saw him receive the golden ball as player of the tournament.[12] At the 2014 FIFA World Cup, Neymar scored four goals before he suffered a fractured vertebra in his spine in the quarter-finals and missed the rest of the tournament. He received the Bronze Boot as the tournament's third top goalscorer, and was named in the World Cup All Star XI. Neymar captained Brazil at the 2015 Copa América, but in the group stage was given a suspension which ruled him out for the rest of the tournament.
With 46 goals in 67 matches for Brazil, Neymar is the fifth highest goalscorer for his national team.[13] In 2012 and 2013, SportsPro named him the most marketable athlete in the world.[14] In December 2013 he was ranked by The Guardian as the sixth best player in the world.
The Mummy (franchise)
The Mummy is the title of several horror-adventure film series centered on an ancient Egyptian priest who is accidentally resurrected, bringing with him a powerful curse, and the ensuing efforts of heroic archaeologists to stop him. These three series of films are accompanied by a spin-off film series, two comic book adaptations, three video games, an animated television series, and a roller coaster ride.
nature and nurture
The phrase nature and nurture relates to the relative importance of an individual's innate qualities ("nature" in the sense of nativism or innatism) as compared to an individual's personal experiences ("nurture" in the sense of empiricism or behaviorism) in causing individual differences, especially in behavioral traits. The alliterative expression "nature and nurture" in English has been in use since at least the Elizabethan period[1] and goes back to medieval French.[2] The combination of the two concepts as complementary is ancient (Greek: ἁπό φύσεως καὶ εὐτροφίας[3]).
The phrase in its modern sense was popularized by the English Victorian polymath Francis Galton in discussion of the influence of heredity and environment on social advancement.[4][5] Galton was influenced by the book On the Origin of Species, written by his half-cousin, Charles Darwin.
The view that humans acquire all or almost all their behavioral traits from "nurture" was termed tabula rasa ("blank slate") by John Locke in 1690. A "blank slate view" in human developmental psychology, assuming that human behavioral traits develop almost exclusively from environmental influences, was widely held during much of the 20th century (sometimes termed "blank-slatism"). The debate between "blank-slate" denial of the influence of heritability and the view admitting both environmental and heritable traits has often been cast in terms of nature versus nurture. These two conflicting approaches to human development were at the core of an ideological dispute over research agendas during the latter half of the 20th century. As both "nature" and "nurture" factors were found to contribute substantially, often in an inextricable manner, such views were seen as naive or outdated by most scholars of human development by the 2000s.[6]
In a 2014 survey of scientists, many respondents wrote that the dichotomy of nature versus nurture has outlived its usefulness and should be retired. The reason is that in many fields of research, close feedback loops have been found in which "nature" and "nurture" influence one another constantly (as in self-domestication), while in other fields, the dividing line between an inherited and an acquired trait becomes unclear (as in the field of epigenetics or in fetal development).[7][8]
History of the debate
John Locke's An Essay Concerning Human Understanding (1690) is often cited as the foundational document of the "blank slate" view. Locke was criticizing René Descartes' claim of an innate idea of God universal to humanity. Locke's view was harshly criticized in his own time. Anthony Ashley-Cooper, 3rd Earl of Shaftesbury, complained that by denying the possibility of any innate ideas, Locke "threw all order and virtue out of the world", leading to total moral relativism. Locke's was not the predominant view in the 19th century, which on the contrary tended to focus on "instinct". Leda Cosmides and John Tooby noted that William James (1842–1910) argued that humans have more instincts than animals, and that greater freedom of action is the result of having more psychological instincts, not fewer.[9]
The question of "innate ideas" or "instincts" was of some importance in the discussion of free will in moral philosophy. In 18th-century philosophy, this was cast in terms of "innate ideas" establishing the presence of a universal virtue, prerequisite for objective morals. In the 20th century, this argument was in a way inverted, as some philosophers now argued that the evolutionary origins of human behavioral traits force us to concede that there is no foundation for ethics (J. L. Mackie), while others treat ethics as a field in complete isolation from evolutionary considerations (Thomas Nagel).[10]
In the early 20th century, there was an increased interest in the role of the environment, as a reaction to the strong focus on pure heredity in the wake of the triumphal success of Darwin's theory of evolution.[11]
During this time, the social sciences developed as the project of studying the influence of culture in clean isolation from questions related to "biology". Franz Boas's The Mind of Primitive Man (1911) established a program that would dominate American anthropology for the next fifteen years. In this study he established that in any given population, biology, language, and material and symbolic culture are autonomous; that each is an equally important dimension of human nature; but that no one of these dimensions is reducible to another.
The tool of twin studies was developed after World War I as an experimental setup intended to exclude all confounders based on inherited behavioral traits. Such studies are designed to decompose the variability of a given trait in a given population into a genetic and an environmental component.
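To make the decomposition concrete (this illustration is not taken from the source), the classical twin design compares trait correlations in identical (MZ) and fraternal (DZ) twin pairs. Under its simplifying assumptions, the phenotypic variance splits into genetic, shared-environment and unique-environment components, and Falconer's formula gives a rough heritability estimate:

\[
\operatorname{Var}(P) = \operatorname{Var}(G) + \operatorname{Var}(C) + \operatorname{Var}(E),
\qquad
h^{2} \approx 2\,\bigl(r_{\mathrm{MZ}} - r_{\mathrm{DZ}}\bigr)
\]

where P is the measured trait, G the genetic component, C the shared environment, E the unique environment, and r_MZ, r_DZ the observed twin-pair correlations. With hypothetical correlations of 0.80 for identical twins and 0.55 for fraternal twins, the rough estimate would be h² ≈ 2(0.80 − 0.55) = 0.50, in the same range as the 40% to 50% heritabilities mentioned later in this section.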
John B. Watson in the 1920s and 1930s established the school of purist behaviorism that would become dominant over the following decades. Watson was convinced of the complete dominance of cultural influence over anything heritability might contribute, to the point of claiming:
- "Give me a dozen healthy infants, well-formed, and my own specified world to bring them up in and I'll guarantee to take any one at random and train him to become any type of specialist I might select – doctor, lawyer, artist, merchant-chief and, yes, even beggar-man and thief, regardless of his talents, penchants, tendencies, abilities, vocations, and race of his ancestors." (Behaviorism, 1930, p. 82)
- Ashley Montagu took a similarly purist position, allowing no contribution from heredity: "Man is man because he has no instincts, because everything he is and has become he has learned, acquired, from his culture [...] with the exception of the instinctoid reactions in infants to sudden withdrawals of support and to sudden loud noises, the human being is entirely instinctless."[12]
Robert Ardrey in the 1960s argued for innate attributes of human nature, especially concerning territoriality, in the widely read African Genesis (1961) and The Territorial Imperative. Desmond Morris in The Naked Ape (1967) expressed similar views. Organised opposition to Montagu's kind of purist "blank-slatism" began to pick up in the 1970s, notably led by E. O. Wilson (On Human Nature, 1979). Twin studies established that there was, in many cases, a significant heritable component, but these results did not point to an overwhelming contribution of heritable factors, with heritability typically ranging around 40% to 50%, so the controversy cannot be cast simply in terms of purist behaviorism versus purist nativism. Rather, it was purist behaviorism that was gradually replaced by the now-predominant view that both kinds of factors usually contribute to a given trait, anecdotally phrased by Donald Hebb as an answer to the question "which, nature or nurture, contributes more to personality?" by asking in response, "Which contributes more to the area of a rectangle, its length or its width?"[14] In a comparable avenue of research, anthropologist Donald Brown in the 1980s surveyed hundreds of anthropological studies from around the world and collected a set of cultural universals. He identified approximately 150 such features, coming to the conclusion that there is indeed a "universal human nature", and that these features point to what that universal human nature is.[15]
At the height of the controversy, during the 1970s to 1980s, the debate was highly ideologised. In Not in Our Genes: Biology, Ideology and Human Nature (1984), Richard Lewontin, Steven Rose and Leon Kamin criticise "genetic determinism" from a Marxist framework, arguing that "Science is the ultimate legitimator of bourgeois ideology [...] If biological determinism is a weapon in the struggle between classes, then the universities are weapons factories, and their teaching and research faculties are the engineers, designers, and production workers." The debate thus shifted away from whether heritable traits exist to whether it was politically or ethically permissible to admit their existence. The authors denied that it was, asking that evolutionary inclinations be discarded in ethical and political discussions regardless of whether they exist.[16]
Heritability studies became much easier to perform, and hence much more numerous, with the advances of genetic studies during the 1990s. By the late 1990s, an overwhelming amount of evidence had accumulated that amounted to a refutation of the extreme forms of "blank-slatism" advocated by Watson or Montagu.
This revised state of affairs was summarized in books aimed at a popular audience from the late 1990s. Judith Rich Harris's The Nurture Assumption: Why Children Turn Out the Way They Do (1998) was heralded by Steven Pinker as a book that "will come to be seen as a turning point in the history of psychology",[17] but Harris was criticized for exaggerating the point that "parental upbringing seems to matter less than previously thought" into the implication that "parents do not matter".[18]
The situation as it presented itself by the end of the 20th century was summarized in The Blank Slate: The Modern Denial of Human Nature (2002) by Steven Pinker. The book became a best-seller and was instrumental in bringing to the attention of a wider public the paradigm shift away from the behaviourist purism of the 1940s to 1970s that had taken place over the preceding decades. Pinker portrays adherence to pure blank-slatism as an ideological dogma linked to two other dogmas found in the dominant view of human nature in the 20th century, which he termed the "noble savage" (in the sense that people are born good and corrupted by bad influence) and the "ghost in the machine" (in the sense that there is a human soul capable of moral choices completely detached from biology). Pinker argues that all three dogmas were held onto for an extended period even in the face of evidence because they were seen as desirable: if any human trait is purely conditioned by culture, any undesired trait (such as crime or aggression) may be engineered away by purely cultural (political) means. Pinker focuses on reasons he assumes were responsible for unduly repressing evidence to the contrary, notably the fear of (imagined or projected) political or ideological consequences.