Tuesday, 30 December 2014

The Changing Image of Witches

As I have previously mentioned, as part of my history AS Level I studied the European Witch Craze (1560-1640). It was one of my favourite papers, as I found the concept of a 'craze' fascinating. The witch hunts were highly localised, had varied causes and were mostly experienced as individual events resulting in, on average, under 10 executions; the 'witch craze' is a label that subsequent historians have applied.

In order to pursue my interest in the European Witch Craze I visited the British Museum to see the 'Witches and Wicked Bodies' exhibition, which examines the portrayal of witches and witchcraft in art from the Renaissance to the end of the 19th century. What interested me most about the exhibition is the changing visual iconography of witches. In the Middle Ages, witches were often portrayed as naked women, riding goats and surrounded by skulls and other symbols of death. Witches were pictured at sabbaths, raising the dead within magic circles, surrounded by demons or performing evil spells. They were often imagined riding on dragons and other beasts.

However, by the end of the 19th century, hideous old hags with distended breasts and snakes for hair had mostly been replaced by sexualised and mysteriously exotic images of feminine evil. By the 20th century this had developed into images of women with broomsticks, cats and elongated facial features. Modern witches come in all shapes and sizes (often quite beautiful) and are portrayed in a multitude of ways; anyone who has watched Harry Potter will be familiar with cats and pointed hats as representations of witches. Modern imagery surrounding witches is focused more on childish entertainment than on spreading fear, propaganda and warnings of witchcraft, which was the purpose of art portraying witches and witchcraft in the Middle Ages.

The use of art was crucial in helping the European witch hunts develop into a craze: support from the local populace was needed for witchcraft trials to be carried out successfully, and word of mouth and art were the main methods of spreading ideas amongst a mostly illiterate population. Changing representations of witches in art may also have contributed to the end of the witch craze, as images of witches shifted from harrowing and fear-inciting to sexualised and intriguing.

To anyone who lives in London or can travel to London easily, I would highly recommend visiting the exhibition!

Saturday, 20 December 2014

Myths about WWI

As I recently mentioned in my blog on myths about elephants, I find it very interesting how myths retold over time become 'facts' simply because they have withstood the test of time, which attaches some authority to them.

I read a very interesting article on myths about WWI recently, and how much of what we think we know about the conflict is actually wrong. The article was also very thought-provoking, as it argued that by setting WWI apart as uniquely awful we blind ourselves to the reality of war and risk belittling the experience of soldiers and civilians caught up in countless other appalling conflicts throughout history and the present day. Here is a summary of the article:

One of the most famous 'myths' about WWI is that it was the bloodiest war in history to that point, the 'war to end all wars'. That is actually untrue. Fifty years before WW1 broke out, southern China was torn apart by the 14-year Taiping rebellion, in which between 20 and 30 million people died, compared to the 17 million soldiers and civilians who lost their lives during WWI. It is also important to look at deaths as a percentage of population rather than as a raw number: although more Britons died in WW1 than in any other conflict, the bloodiest war in our history relative to population size is the Civil War, which killed around 4% of the population compared with WW1's 2%.
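
To make the percentage point concrete, here is a minimal sketch of the arithmetic in Python (the death tolls and population figures below are rough, illustrative assumptions of mine, not numbers taken from the article):

```python
# Rough, illustrative figures - assumptions for the sake of the arithmetic,
# not numbers taken from the article.
conflicts = {
    # name: (war deaths, approximate population at the time)
    "English Civil War (1642-51)": (190_000, 5_000_000),
    "WW1, UK (1914-18)": (744_000, 46_000_000),
}

for name, (deaths, population) in conflicts.items():
    share = deaths / population * 100
    print(f"{name}: {deaths:,} deaths, about {share:.1f}% of the population")
```

Run with these assumed figures it prints roughly 3.8% against 1.6% - the same order as the article's '4% compared to 2%' - and shows why a raw death toll on its own can mislead.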

It is also untrue that men lived in the trenches for years on end. Front-line trenches were very unpleasant: cold, wet and exposed to the enemy, and soldiers would quickly lose morale if they spent too much time in them. As a result, the British army rotated men in and out continuously. Between battles, a unit spent perhaps 10 days a month in the trench system and, of those, rarely more than three days right up on the front line. During moments of crisis the British could occasionally spend up to seven days on the front line, but were far more often rotated out after just a day or two.

Another myth that the article debunked is that most soldiers died. In fact, in the UK around six million men were mobilised, and of those just over 700,000 were killed: 11.5%. As a British soldier you were more likely to die during the Crimean War (1853-56) than in WW1.

And although the great majority of casualties in WW1 were from the working class, the upper class did not get off lightly: the social and political elite were hit disproportionately hard by WW1. Their sons provided the junior officers whose job it was to lead the way over the top and expose themselves to the greatest danger as an example to their men. Some 12% of the British army's ordinary soldiers were killed during the war, compared with 17% of its officers. Eton alone lost more than 1,000 former pupils - 20% of those who served. UK wartime Prime Minister Herbert Asquith lost a son, while future Prime Minister Andrew Bonar Law lost two. Anthony Eden lost two brothers, another brother of his was terribly wounded, and an uncle was captured.

The famous saying 'lions led by donkeys' (i.e. that brave British soldiers were sent to their deaths by incompetent commanders) is also inaccurate. Whilst some generals did struggle, many were brilliant. During the war more than 200 generals were killed, wounded or captured, and most visited the front lines every day. In battle they were considerably closer to the action than generals are today. Rarely in history have commanders had to adapt to a more radically different technological environment: British commanders had been trained to fight small colonial wars, and were suddenly thrust into a massive industrial struggle unlike anything the British army had ever seen. Despite this, within three years the British had effectively invented a method of warfare still recognisable today, and by the summer of 1918 the British army was probably at its best ever, inflicting crushing defeats on the Germans.

Another myth is that tactics on the Western Front remained unchanged despite repeated failure. In reality tactics and technology changed radically: it was a time of extraordinary innovation. In 1914 generals on horseback galloped across battlefields as men in cloth caps charged the enemy without the necessary covering fire. Both sides were overwhelmingly armed with rifles. Four years later, steel-helmeted combat teams dashed forward protected by a curtain of artillery shells. They were now armed with flame throwers, portable machine-guns and grenades fired from rifles. Above, planes, which in 1914 would have appeared unimaginably sophisticated, duelled in the skies, some carrying experimental wireless radio sets, reporting real-time reconnaissance. Huge artillery pieces fired with pinpoint accuracy - using only aerial photos and maths they could score a hit on the first shot. Tanks had gone from the drawing board to the battlefield in just two years, also changing war for ever.

Whilst it is odd to talk about 'winning' WWI (swathes of Europe lay wasted; millions were dead or wounded; survivors lived on with severe mental trauma; the UK was broke), in a narrow military sense the UK and its allies convincingly won. Germany's battleships had been bottled up by the Royal Navy until their crews mutinied rather than make a suicidal attack against the British fleet. Germany's army collapsed as a series of mighty allied blows scythed through supposedly impregnable defences. By late September 1918 the German emperor and his military mastermind Erich Ludendorff admitted that there was no hope and that Germany must beg for peace. The 11 November Armistice was essentially a German surrender. Unlike Hitler in 1945, the German government did not insist on a hopeless, pointless struggle until the allies were in Berlin - a decision that saved countless lives, but was seized upon later to claim that Germany never really lost.

A very common misconception is that the Treaty of Versailles was extremely harsh. Whilst the treaty confiscated 10% of Germany's territory, it left Germany the largest, richest nation in central Europe. The country was largely unoccupied, and financial reparations were linked to its ability to pay - and mostly went unenforced anyway. The treaty was notably less harsh than those that ended the 1870-71 Franco-Prussian War and World War Two. In the former, the German victors annexed large chunks of two rich French provinces, part of France for between 200 and 300 years and home to most of French iron ore production, as well as presenting France with a massive bill for immediate payment. After WW2, Germany was occupied and split up, its factory machinery was smashed or stolen, and millions of prisoners were forced to stay with their captors and work as slave labourers. Germany lost all the territory it had gained after WW1 and another giant slice on top of that. Versailles was not harsh, but it was portrayed as such by Hitler, who sought to create a tidal wave of anti-Versailles sentiment on which he could ride into power.

Finally, not everyone hated WWI. Whilst some witnessed unimaginable horrors that left them mentally and physically incapacitated for life, other soldiers enjoyed WW1. If they were lucky they would avoid a big offensive, and much of the time conditions might be better than at home.

For the British there was meat every day - a rare luxury back home - plus cigarettes, tea and rum, part of a daily diet of more than 4,000 calories. Remarkably, absentee rates due to sickness, an important barometer of a unit's morale, were hardly above those of peacetime. Many young men enjoyed the guaranteed pay, the intense comradeship, the responsibility and a much greater sexual freedom than in peacetime Britain.

source: http://www.bbc.co.uk/news/magazine-25776836

Monday, 15 December 2014

When Was The First Total War?

I am very interested in the concept of 'total war': whether such wars can and do exist, and if so, which wars have been 'total' and which, like the wars in Afghanistan (2001) and Iraq (2003), have been limited. The phrase 'total war' is a twentieth-century one, but the concept itself is much older. I think the French Revolutionary Wars were the first modern total war, and the Peloponnesian War the first pre-modern one. Carl von Clausewitz first articulated the concept in Vom Kriege (On War, published in 1832), where he discussed how wars could not be fought within limits or by laws, as the logic of war is 'absolute' and demands that all available resources be deployed. Clausewitz said that Napoleon had changed the nature of war, and that the French Revolutionary Wars marked the point at which wars became the business of the people as opposed to the 'hobby of kings'. Clausewitz also coined the famous saying that "war is a continuation of political intercourse, carried on with other means".

In a total war there is a full mobilisation of available resources and population to the war effort; there is no real distinction between civilians and combatants; the fighting is prolonged and bloody (because it ends only with 'total surrender'); and there are no 'rules of war'. These criteria fit, to some extent, the French Revolutionary and Napoleonic Wars of 1792 to 1815, in which the whole of society was mobilised: the wars were seen as 'people's wars' and all Frenchmen were in permanent requisition for the service of the armies. Almost 3 million men were enlisted in the army between 1792 and 1815, and almost 1.7 million of them died. The use of 'populicide' was seen in the massacres in the Vendée, where around 200,000 people were slaughtered. Prior to the French Revolutionary Wars, wars were relatively easy to control and restrain: armies were small, major battles infrequent and civilians well treated. They were the wars of kings, fought by professional armies who respected their opponents as men of honour.

The first pre-modern total war for which we have enough evidence is, I believe, the Peloponnesian War of 431-404 BC. Sparta was a society organised for war: weak babies were abandoned at birth, young men and women went through brutal military training as they grew up, and all male citizens were enrolled in the army from 20 to 30 years of age. Athens, likewise, focused on its navy of 300 triremes - a fleet almost eight times the size of the next largest naval power's, and almost the same size as today's US Navy, which serves a population 1,500 times larger. In the Peloponnesian War all citizens were considered targets, as seen in the senseless massacres of children and the lack of mercy towards prisoners of war. The war violated the harsh code of previous Greek warfare, which had been relatively controlled and formalised: opposing armies would rush at each other and engage in hand-to-hand battle until one side surrendered. The Spartans abandoned this convention, destroying whole cities and butchering entire populations. The war ended only in total surrender, and destroyed the democratic institutions of Athens.

Wednesday, 10 December 2014

North Korea on the USA

Something I find very interesting is how events like the fall of the Roman Empire can be reused again and again over time to make current points. This was seen in Gibbon's Decline and Fall (1776), a book on the Roman Empire that served as a warning to contemporary British empire-builders against the 'feminisation' of their empire. A more recent example comes from North Korea, which has likened the fate of the US to the fall of the Roman Empire - a great illustration of how the Roman Empire can be used to make contemporary analogies.

Here is the article:

In Australia last month, President Barack Obama spoke about the present day’s place in history.

“I often tell young people in America that, even with today’s challenges, this is the best time in history to be alive,” the president said at the University of Queensland. The president’s 15 November speech was followed by one by then-Secretary of Defense Chuck Hagel, who explained how the US military would need to reform to keep its place in history.

Two weeks later, in a commentary published by the North’s state-run Korean Central News Agency (KCNA), the speeches are described as the “poor shriek of those facing ruin” and a “recognition of the dark reality in the US as it is a reflection of extreme uneasiness and horror-phobia”.

As KCNA puts it, the speeches remind “one of the old Roman empire that was buried in history after facing a ruin for coveting for prosperity through aggression and wars.

“The poor fate of the US reminiscent of the ruin of the Roman empire is a due outcome of its history of aggression and arbitrary practices,” the article continues.

North Korea is, of course, no stranger to hurling inventive insults towards Americans. In 2009, KCNA reported that a North Korean official had labelled Hillary Clinton a “funny lady” who was “by no means intelligent”. Earlier this year, in light of a UN report on human rights in North Korea, the country released its own human rights report on the US, concluding it was “a living hell”.

The comparison to the Roman empire is slightly unusual, however: North Korean propaganda rarely steps outside its own mythology to think about other histories. Defectors from the country have said they are taught a “general history of the world” in schools, though the emphasis appears to be on 20th-century history.

Comparing the US to the Roman empire is hardly a new concept – it’s been around for decades if not far longer. “Americans have been casting eyes back to ancient Rome since before the revolution,” Cullen Murphy wrote in his 2008 book Are We Rome?

Murphy points out that most of the American allusions to Rome ignore the complexity of history (some historians argue that Rome didn’t really fall), and depending on who is doing the talking, Rome “serves as either a grim cautionary tale or an inspirational call to action”.

In recent years, with America wracked by internal political divisions, economic uncertainty and geopolitical enemies of all sorts around the world, the former option seems to have become more popular – especially among America’s geopolitical rivals. For the ascendant ones, it seems like good news: China’s CCTV state broadcaster commissioned a huge TV series titled The Rise of the Great Powers a few years ago, in effect announcing China’s ascendance and declaring the end of America.

KCNA’s commentary takes a darker angle. “The US is now thrown into confusion as its unipolar domination system called world order is getting out of control,” the unnamed writer notes, later pointing to the problems in Ukraine and the Middle East. “Its military muscle and dollar’s position that have propped Washington’s moves for world domination are now sinking rapidly.”

“The US has now the hardest time in its history,” KCNA reported.

It’s a grim assessment, but it poses the question: if the US is the Roman empire, who is North Korea? The Vandals or some more obscure German tribe? According to James Romm, a professor of classics at Bard College in New York, the hermit kingdom resembles ... the Roman empire.

“In the century or so following Caesar’s assassination, his successors achieved a power so absolute that they were worshipped as gods on Earth, as the Kims are today,” Romm wrote for the Los Angeles Times earlier this year. “Yet they, again like the Kims, suffered chronic insecurity about their legitimacy, and that fear led to terrible abuses.”

I think this shows very well how different parties can manipulate history in order to suit their present-day concerns.

source: http://www.theguardian.com/world/2014/dec/08/north-korea-us-roman-empire?CMP=twt_gu

Friday, 5 December 2014

Should the Elgin Marbles be Returned to Greece?

Today it was announced that the Elgin Marbles have left London for the first time, on loan from the British Museum to the State Hermitage Museum in St. Petersburg. The Hermitage was founded in 1764 by Catherine the Great to enable Russia to participate in the European Enlightenment. The politics of the move are very interesting: Neil MacGregor, director of the British Museum, has said that the relationship between the two museums is all the more important given fears of a new Cold War between the Kremlin and the west. The British Museum has used history to improve cultural links with China, Afghanistan, Africa and the Middle East. For example, four years ago the Cyrus Cylinder, a small Persian clay tablet often described as the earliest charter of human rights, was loaned to Iran for an exhibition at which it was seen by nearly 500,000 people.

However, there is a controversy over the sculptures which I wish to talk about today. Whilst visiting the British Museum a few weeks ago, I was intrigued by the idea, held by some, that most of the objects in the museum aren't actually owned by the British (they were 'stolen' centuries ago) and therefore ought to be returned.

Greece refuses to recognise the British Museum's ownership of the Elgin Marbles, which make up about 30% of the surviving decoration from the Parthenon. The classical marbles - officially known as the Parthenon sculptures - were removed from the Parthenon in Athens by Lord Elgin in the early 19th century and acquired by parliament in 1816. The museum maintains that the sculptures' reputation as art rather than decoration was forged in London, and that they can best be understood in the context of Western civilisation by remaining in the museum. However, for the past 40 years Athens has argued that they belong in Greece alongside the other remaining fragments, in a museum with a view of the Acropolis, where the ruined Parthenon stands. Amal Clooney has argued that "the Elgin Marbles, like fox-hunting, represent an overbearing past" that Britain is growing out of.

The Greeks claim that even if Lord Elgin, ambassador to the Ottoman Empire, bought the statues legally, Greece was at the time under Turkish occupation: the rulers of the day may have agreed to the deal, but the Greek people did not. Others argue that had the sculptures stayed in Athens, they might have been ground down to produce lime or used as rubble for building foundations.

Ultimately, the Greeks' argument has much larger implications, for if they can prove their argument is right, hundreds of other artefacts would have to be removed from museums around the world and sent back to where they originally came from. The Venus de Milo - also removed from Greece during the Ottoman empire - would have to leave the Louvre. Tipu's Tiger would have to be returned to India from the V&A. The magnificent Assyrian galleries at the British Museum would be sent back to Baghdad. And there is another problem: countries and borders have changed significantly over time. For example, where would the great altar removed from the temple at Pergamon to Berlin go? Pergamon is in Turkey, but was once part of imperial Greece, imperial Rome, imperial Persia and imperial Byzantium - so who does it belong to?


Understanding history means recognising that things always change and that all actions are products of their time. Sir John Boardman has warned that the return of the Elgin Marbles would 'ruin' museums and set an "appalling precedent". Both David Cameron and the British Museum have firmly opposed calls for their return. "The Parthenon sculptures in London are an important representation of ancient Athenian civilisation in the context of world history," a museum spokesman said. And, more importantly in my opinion, the removal of the sculptures to Greece would have a knock-on effect for museums around the world, ultimately causing far more damage than good.

Tuesday, 2 December 2014

Exciting Richard III News!

Today I read an article that I found very interesting and exciting, especially as a historian of the 16th century. Despite the absence of a male-line genetic match, DNA results came back with a 99.999% probability that the body found in a Leicester car park several years ago was that of the Plantagenet king, Richard III. Moreover, the break in the male line is evidence of infidelity in his family tree. This could have profound historical implications, depending on where in the family tree it occurred, as it could cast doubt on the Tudor claim to the English throne or, indeed, on Richard's. Some are even questioning whether the findings have an impact on the current Royal Family's legitimacy, but this cannot be commented on, as it is still unknown when the break, or breaks, in the lineage occurred.

In 2012, scientists extracted genetic material from the remains discovered on the former site of the Greyfriars friary, where Richard was buried after his death at the Battle of Bosworth in 1485. Their analysis shows that DNA passed down on the maternal side matches that of living relatives, but genetic information passed down on the male side does not. Given the amount of additional evidence linking the body to Richard III, scientists have concluded that infidelity is the most likely explanation for this break. For example, the curvature in the skeleton's spine matches contemporary descriptions of Richard having one shoulder higher than the other. Genes involved in hair and eye colour also suggest that Richard III had blue eyes, matching one of the earliest known paintings of the king. However, the hair colour analysis gave a 77% probability that the individual was blond, which does not match the depiction - but the researchers say the test is most closely correlated with childhood hair, and in some blond children hair darkens during adolescence.

"If you put all the data together, the evidence is overwhelming that these are the remains of Richard III," said Dr Turi King from Leicester University, who led the study. She said that the lack of a match on the male side was not unexpected, because her previous research had shown there was a 1-2% rate of "false paternity" per generation. The instance of female infidelity could have occurred anywhere in the generations that separate Richard III from the 5th Duke of Beaufort (1744-1803), whose living descendants provided samples of male-line DNA to be compared against that of the Plantagenet king.
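
Dr King's 1-2% figure invites a rough back-of-the-envelope check on how likely at least one break is across that stretch of the family tree. Here is a minimal sketch in Python; the number of father-to-son links is my illustrative assumption, not a figure from the article:

```python
# Chance of at least one "false paternity" event over n father-to-son links,
# given a per-generation rate p and assuming independent generations:
# P(at least one break) = 1 - (1 - p) ** n
def prob_of_break(p: float, n: int) -> float:
    return 1 - (1 - p) ** n

N_LINKS = 19  # assumed number of father-to-son links; illustrative only

for p in (0.01, 0.02):  # the 1-2% per-generation rate cited by Dr King
    print(f"rate {p:.0%}: P(at least one break over {N_LINKS} links) = "
          f"{prob_of_break(p, N_LINKS):.0%}")
```

With those assumptions the chance of at least one break comes out at roughly 17-32%, which is why the researchers say the male-line mismatch was not unexpected.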

"We may have solved one historical puzzle, but in so doing, we opened up a whole new one” said Professor Kevin Schurer of the University of Leicester.

Richard III and Henry Tudor (later Henry VII) were both descendants of King Edward III. The infidelity could, in theory, have occurred either on the branch leading back from Henry to Edward or on the branch leading from Richard to Edward. Henry's ancestor John of Gaunt was plagued by rumours of illegitimacy throughout his life, apparently prompted by the absence of Edward III at his birth. He was reportedly enraged by gossip suggesting he was the son of a Flemish butcher. "Hypothetically speaking, if John of Gaunt wasn't Edward III's son, it would have meant that (his son) Henry IV had no legitimate claim to the throne, nor Henry V, nor Henry VI," said Dr Schurer.

Richard's maternal-line - or mitochondrial - DNA was matched to two living relatives of his eldest sister Anne of York. Michael Ibsen and Wendy Duldig are 14th cousins and both carry the same extremely rare genetic lineage as the body in the car park.

Richard III was defeated in battle by Henry Tudor, marking the end of the Plantagenet dynasty and the beginning of Tudor rule, which lasted until Queen Elizabeth I died childless in 1603. Richard's battered body was subsequently buried in Greyfriars. 


source: http://www.bbc.co.uk/news/science-environment-30281333

Friday, 28 November 2014

Richard Evans on Boris Johnson

One thing I really enjoy doing is reading book reviews. I recently read a review by Richard J Evans of Boris Johnson's 'The Churchill Factor: How One Man Made History', which made me laugh a lot. Evans criticised Johnson for fabricating or stretching his facts in order to support his hypothesis: that Churchill shaped the 20th century and the outcome of WWII. The article made me think a lot about the use of facts to support historical research, and about the use of history to make present-day points (Evans accused Johnson of trying to draw subtle comparisons between himself and Churchill - Johnson may be running for prime minister in the near future, after all!).

I have put a link to the article below - please read it and let me know what you all think!

http://www.newstatesman.com/books/2014/11/one-man-who-made-history-another-who-seems-just-make-it-boris-churchill

Thursday, 20 November 2014

The Elephant

It is very interesting how old myths and folk tales from thousands of years ago can develop into common 'facts' that we all know and accept in modern times. One such example is the elephant, which has been misunderstood throughout history.

I recently read a very interesting article on historical misconceptions about the elephant, which I have summarised below.

Elephants were named after Japan's 4th emperor, Emperor Itoku - or so the story goes. Rumour had it that his nose was so long he could swing it like a lasso, throw it into the river and drink through his nostrils. Whilst this was one of his greatest attributes, it would also prove to be his downfall: when the current was surprisingly strong in the winter of 476 BC, his nose became caught in the powerful flow and he was swept away, never to be seen again.

In the song “Heffalumps and Woozles” from Winnie the Pooh, elephants wear tuxedos and are blue. Fantastical, but unimaginative compared to what European natural historians used to believe about the elephant. These misconceptions persisted despite many scholars and ordinary citizens knowing otherwise! 

Some argue that misconceptions about the elephant are understandable, seeing as elephants come from Africa and Asia and Europeans simply didn't have enough contact with the animals to inform their judgements. The reality, though, is that the elephant has an old (and brutal) history in Europe.

One such misconception is that elephants have no knees. As the 17th-century writer Sir Thomas Browne noted, even the ancient Greeks knew that elephants had knees. Western armies first fought elephants at the Battle of Gaugamela in present-day northern Iraq in 331 BC, between Alexander the Great and Darius of Persia. War elephants were an incredibly effective weapon in ancient warfare - if you could manage to feed them and keep them from stepping on your own soldiers. But Alexander overcame the Persians, captured their elephants and sent the beasts back to Greece. Having seen elephants for themselves, the Greeks had no reason to doubt their possession of knees.

Indeed, Aristotle's description of the elephant is remarkably detailed, and he was certain that they had knees. During the Roman Empire, most citizens knew elephants had knees because they attended gladiatorial games to watch the cruel slaughter of the animals hauled in from Africa. In fact, in his encyclopedia Natural History, Pliny the Elder specifically mentions the elephant's knees when recounting a battle in Rome's Circus between 20 elephants and a number of men: "One of these animals fought in a most astonishing manner; being pierced through the feet, it dragged itself on its knees towards the troop, and seizing their bucklers, tossed them aloft into the air: and as they came to the ground they greatly amused the spectators, for they whirled round and round in the air, just as if they had been thrown up with a certain degree of skill, and not by the frantic fury of a wild beast."

Yet even after Pliny and Aristotle made it clear that elephants had knees, the idea spread again after the fall of the Roman Empire thanks to the Physiologus, written between the second and third centuries AD, which claims that the elephant has no knee joints that would enable him to lie down to sleep if he wanted to. By the 13th century, however, this was being challenged: Albertus Magnus, for instance, noted that the elephant wasn't missing knees - it just had stiff joints in its legs, necessary to support its weight.

The Physiologus also introduced several other fanciful elephant "facts" that came from the ancient world. The book describes how, when elephants want to conceive, a male and a female must head "to the east near paradise," where they'll find the mandrake. This plant's root has a long history of being used by humans as an aphrodisiac, though as a member of the nightshade family it contains powerful toxins that can kill in high doses. The elephants supposedly eat some of it, and "the female immediately conceives in the womb."

According to the Physiologus the elephant's eternal enemy, the serpent, would try and snatch up baby elephants. Pliny refers to it as a dragon: "this fiend is perpetually at war with the elephant,” he writes, “and is itself of so enormous a size, as easily to envelop the elephants with its folds, and encircle them in its coils. The contest is equally fatal to both; the elephant, vanquished, falls to the earth, and by its weight, crushes the dragon which is entwined around it.” In the Physiologus, the serpent seems decidedly smaller, as the male elephant “kills it by trampling on it until it dies.”

Pliny also wrote that elephants "hate mice and will refuse to eat fodder that has been touched by one". As a researcher of elephant behaviour explained in 2011: "In the wild, anything that suddenly runs or slithers by an elephant can spook it. It doesn't have to be a mouse - dogs, cats, snakes or any animal that makes sudden movements by an elephant's feet can startle it."

This is because elephants have incredibly poor eyesight, relying instead on a highly developed sense of smell. Venomous snakes can kill baby elephants, so they have evolved to be cautious around small creatures. In captivity, at least, they don't seem to be afraid of mice once they've established that the mice are harmless, and have even been known to stomp on mice scurrying around their pens, according to Gordon Grice in The Book of Deadly Animals.

The phrase 'elephants never forget' is again a myth that many still believe to this day. Whilst elephants have excellent memories, the statement is an exaggeration. Because elephants are nomadic they have to remember where water and fodder are located, and they always know how to return to a watering hole they have used previously.

References


http://www.wired.com/2014/11/fantastically-wrong-misconceptions-about-the-elephant/

Badke, D. (2011) The Elephant. The Medieval Bestiary

Browne, T. (1894) The Works of Sir Thomas Browne. Perseus Digital Library, Tufts University

Curley, M. (2009) Physiologus: A Medieval Book of Nature Lore. University Of Chicago Press

Pliny the Elder. (1855) The Natural History. Taylor and Francis, Red Lion Court, Fleet Street. Retrieved from Perseus Digital Library, Tufts University

Wednesday, 19 November 2014

Picture Of The Month


Here is a photo of some men taking a selfie in the 1920s. It shows how little has changed over nearly a century!

Saturday, 15 November 2014

The Fall of the Berlin Wall

Historians can tell you pretty much everything about the fall of the Berlin Wall. Why it happened, where it happened, how it happened. The recent 25th anniversary of the Fall of the Berlin Wall made me think about whether there is anything that a person who was there at the time knows that historians and later generations do not.

Most obviously, people who were there will know what it felt like at the time. The true drama of 9 November 1989 is hard to recapture. One obvious reason for this is that the vast majority of photos and videos show only the side of the Wall covered in colourful graffiti - the western side, the free side, the one that already enjoyed the freedom of expression all that graffiti represents. Yet it was the other side of the concrete barrier that mattered, the side that people had risked their lives to climb over.

The fall of the Berlin Wall was a momentous occasion for West Berliners, and for West Germans, but it was not the day of unification. That came nearly a year later, on 3 October 1990, after a majority of East Germans had voted to join West Germany, and Helmut Kohl and George HW Bush had skilfully negotiated it with Gorbachev. The fall of the Wall was a day of liberation, for those behind the Wall, not the day of unification for those in front of it. The emotional quality of this liberation can only be captured if you can imagine what it was like to live behind that “anti-fascist protection rampart” (its official name) for all your life, never setting foot in the western half of your own city, and with the expectation that this would continue for years to come.

Here is a second thing that historians struggle to recapture: the sense of what people at the time did not know. To those who lived behind it, the Berlin Wall had become a seemingly unchangeable fact of physical geography. Even when things began to change so dramatically in Poland and Hungary, most people just did not believe the Wall could crumble; after all, there was a nuclear-armed empire holding it up. However much the historian warns against the problems of hindsight, you simply cannot un-know your knowledge of what came afterwards. So even if you do not fall into the trap of writing history as if that which actually happened somehow had to happen - what Henri Bergson called "the illusions of retrospective determinism" - it is almost impossible to recreate the emotional intensity of the moment of liberation. For that intensity came from having lived for most, if not all, of your life with the certainty that the Wall was permanent. Some did believe that the Wall would come down and Germany would be united, but none foresaw when and, above all, how.

For example, there is the story of a former MI6 man who, on the evening of 9 November, was meeting with his colleagues from the West German foreign intelligence service, the Bundesnachrichtendienst. The West German spies were in the middle of telling the British spooks, clearly on the basis of their excellent East German sources, that change in East Germany would only come very slowly, perhaps in a matter of years, when someone put his head round the door and said: "Turn on the television: the Wall's open!"

In many ways 1989 has become the new 1789: both a turning point and a reference point. Twenty-five years on, it has given us what is, politically, the best Germany we have ever had. (Culturally, other Germanies have been more interesting, but today's Germany is democratic, rich and supportive of the rest of Europe.) It has made possible the Europe we have today, and there is no corner of the world its consequences have not touched. Those consequences have been of two kinds: the direct results of what actually happened, and the ways in which people read and misread it, which themselves produce unintended consequences. The fall of the Wall has become a kind of master metaphor of our age, used especially by western politicians not just to represent, but to predict, the forward march of freedom.

So where are the people who were around in 1989? It is not that this generation has been silent: Edward Snowden, who was six when the Wall came down, can be seen as both a voice and a hero of that generation. But it is not yet clear what broader political vision this generation represents, how it will change Europe and whether it will appeal to a wider world. Indeed, if it is to succeed, this cannot just be a western generation: the generations in Beijing, Delhi and São Paulo are just as important.

It is hard to tell whether the people around in 1989 will come together as a defining political generation and how they will act. But one thing is clear: their action (or inaction) will determine how we read the Wall’s fall on its 50th anniversary. On them will depend the future of our past.

This blog post was inspired by an article I read, which I would highly recommend: http://www.theguardian.com/world/2014/nov/06/-sp-fall-berlin-wall-what-it-meant-to-be-there

Saturday, 8 November 2014

Demagogues Through History

A few days ago I wrote a blog post on McCarthyism, which many say was an episode of American demagoguery. I therefore thought it would be appropriate to dedicate today's blog to demagogues throughout history!

A demagogue (a manipulator of the people, or rabble-rouser) is a political leader in a democracy who appeals to the emotions, fears, prejudices and ignorance of the lower classes in order to gain power and promote political motives. Demagogues usually oppose deliberation and advocate immediate, violent action to address national crises, accusing moderate and thoughtful opponents of weakness. Demagogues have appeared in democracies since ancient Athens. They exploit a fundamental weakness in democracy: because ultimate power is held by the people, the people can choose to give that power to whomever they want, even someone popular only with the majority of the lower classes.

The word demagogue, meaning a leader of the common people, first arose in ancient Greece with no negative connotation, but eventually came to mean a troublesome kind of leader who occasionally arose in Athenian democracy. Even though democracy gave power to the common people, in Athens elections still tended to favour the aristocratic class, which favoured deliberation and decorum. Demagogues were a new kind of leader who emerged from the lower classes. Demagogues relentlessly advocated action, usually violent, immediately and without deliberation. Demagogues appealed directly to the emotions of the poor and uninformed, pursuing power, telling lies to stir up hysteria, exploiting crises to intensify popular support for their calls to immediate action and increased authority, and accusing moderate opponents of weakness or disloyalty to the nation. While all politicians in a democracy must make occasional small sacrifices of truth, subtlety, or long-term concerns to maintain popular support, demagogues do these things relentlessly and without self-restraint.

Through their popular appeal, demagogues exploit the freedom secured under democracy to gain a level of power for themselves that overrides the rule of law, thereby undermining democracy. The Greek historian Polybius thought that democracies are inevitably undone by demagogues. He said that every democracy eventually decays into "a government of violence and the strong hand," leading to "tumultuous assemblies, massacres, banishments."

Demagogues have existed throughout history. The Athenian leader Cleon, for example, is known as a notorious demagogue mainly because of events described in the writings of Thucydides and Aristophanes. After the failed revolt by the city of Mytilene, Cleon persuaded the Athenians to slaughter not just the Mytilenean prisoners but every man in the city, and to sell their wives and children as slaves; the Athenians rescinded the resolution the following day when they came to their senses. Later, when Athens had completely defeated the Peloponnesian fleet and Sparta begged for peace on almost any terms, Cleon persuaded the Athenians to reject the peace offer. He also taunted the Athenian generals over their failure to bring the campaign at Sphacteria to a rapid close, accusing them of cowardice, and declared that he could finish the job himself in twenty days, despite having no military knowledge. They gave him the job, expecting him to fail. Cleon shrank at being called to make good on his boast and tried to get out of it, but he was forced to take the command. In fact he succeeded, but only by getting the general Demosthenes to lead the armies into battle - treating him with respect now, after previously slandering him behind his back.

It is now thought that Thucydides and Aristophanes exaggerated the vileness of Cleon's character. Both had personal conflicts with Cleon and Cleon was a tradesman, so Thucydides and Aristophanes, who came from the upper classes, looked down on him. Nevertheless, their portrayals of Cleon define the archetypal example of the "low-born demagogue": lower class, hating the nobility, uneducated, despising thought and deliberation, ruthless and unprincipled, bullying, coarse and vulgar in style, rising in popularity by exploiting a national crisis, telling lies to whip up emotions and drive a mob against an opponent, deriving political support primarily from the poor and ignorant, quick to accuse any opponent of weakness or disloyalty, eager for war and violence, inciting the people to terrible acts of destruction they later regret.

Alcibiades is another example of a demagogue. He convinced the people of Athens to attempt to conquer Sicily during the Peloponnesian War, with disastrous results. He convinced the Athenian assembly to make him commander by claiming victory would come easily, appealing to Athenian vanity and championing action and courage over deliberation.

A third example of a demagogue from history is Gaius Flaminius, a Roman consul best known for being defeated by Hannibal at the Battle of Lake Trasimene during the Second Punic War. Gaius Flaminius understood his opponent well, yet made poor decisions, resulting in the loss of 15,000 Roman lives (including his own). He was described as a demagogue by Polybius in The Rise of the Roman Empire: "Flaminius possessed a rare talent for the arts of demagogy."

And finally, of course, a recent example is Joseph McCarthy, a US Senator from 1947 to 1957. Though a poor speaker, McCarthy rose to national prominence during the early 1950s by proclaiming that high places in the United States federal government and military were "infested" with communists, contributing to the second "Red Scare". Ultimately his inability to provide proof for his claims led him to be censured by the United States Senate in 1954, and to fall from popularity. Have a look at my last blog post if you're interested in learning more about McCarthyism!

source: http://en.wikipedia.org/wiki/Demagogue

Thursday, 6 November 2014

McCarthyism

McCarthyism is the practice of making unfair allegations of, for example, disloyalty and treason without proper evidence, and of using unfair investigative techniques.

Senator Joseph McCarthy
The term has its origins in the US during the Second Red Scare (1950-1956), which was characterised by heightened political repression against communists, as well as a campaign spreading fear of their influence on American institutions. The term was originally coined to criticise the anti-communist actions of Republican U.S. Senator Joseph McCarthy, but "McCarthyism" soon took on a broader meaning, describing reckless, unsubstantiated accusations as well as demagogic attacks (when a political leader in a democracy appeals to the emotions, fears, prejudices and ignorance of the lower classes in order to gain power and promote political motives). McCarthy is often seen as a demagogue: demagogues usually oppose deliberation and advocate immediate, violent action to address a national crisis, accusing moderate and thoughtful opponents of being weak. Demagogues exploit a fundamental weakness in democracy: because ultimate power is held by the people, nothing stops the people from giving that power to someone who appeals to the lowest common denominator of a large segment of the population.

During the McCarthy era, thousands of Americans were accused of being communists or communist sympathizers and became the subject of aggressive investigations and questioning before government or private-industry panels, committees and agencies. The primary targets of such suspicions were government employees, those in the entertainment industry, educators and union activists. Accusations were often made despite inconclusive or questionable evidence, and the level of threat posed by a person's real or supposed leftist associations or beliefs was often greatly exaggerated. Many people suffered loss of employment and/or destruction of their careers; some even suffered imprisonment. Most of these punishments came about through trial verdicts later overturned, laws that were later declared unconstitutional and dismissals that were later declared illegal.

The most famous examples of McCarthyism include the speeches, investigations, and hearings of Senator McCarthy himself and the FBI. McCarthyism was a widespread social and cultural phenomenon that affected all levels of society and was the source of a great deal of debate and conflict in the United States.

However the historical period that came to be known as the McCarthy era began well before Joseph McCarthy's own involvement in it. Many factors contributed to McCarthyism, some of them extending back to the years of the First Red Scare (1917–20), when Communism emerged as a recognised political force. Thanks in part to its success in organizing labour unions and its early opposition to fascism, the Communist Party of the United States (CPUSA) increased its membership through the 1930s, reaching a peak of about 75,000 members in 1940–41. While the United States was engaged in World War II and allied with the Soviet Union, the issue of anti-communism was largely muted. With the end of World War II, the Cold War began almost immediately, as the Soviet Union installed Communist puppet régimes across Central and Eastern Europe, while the United States backed anti-communist forces in Greece and China.

Events in 1949 and 1950 sharply increased the sense of threat from Communism in the United States. The Soviet Union tested an atomic bomb in 1949, earlier than many analysts had expected. That same year, Mao Zedong's Communist army gained control of mainland China despite heavy American financial support of the opposing Kuomintang. In 1950, the Korean War began, pitting U.S., U.N., and South Korean forces against Communists from North Korea and China.

There were also more subtle forces encouraging the rise of McCarthyism. For example, conservative politicians in the US often referred to progressive reforms such as child labour laws and women's suffrage as "Communist" or "Red plots." This increased in the 1930s in reaction to the New Deal policies of President Franklin D. Roosevelt. Many conservatives equated the New Deal with socialism or Communism, and saw its policies as evidence that the government had been heavily influenced by Communist policy-makers in the Roosevelt administration.

Joseph McCarthy's involvement with the ongoing cultural phenomenon that would bear his name began with a speech he made on Lincoln Day, February 9, 1950, to a Republican Women's Club. He produced a piece of paper which he claimed contained a list of known Communists working for the State Department, saying "I have here in my hand a list of 205—a list of names that were made known to the Secretary of State as being members of the Communist Party and who nevertheless are still working and shaping policy in the State Department." This speech resulted in a flood of press attention to McCarthy and established the path that made him one of the most recognised politicians in the US.

The first recorded use of the term McCarthyism was in a political cartoon by Washington Post editorial cartoonist Herbert Block, published on March 29, 1950. The cartoon depicted four leading Republicans trying to push an elephant (the traditional symbol of the Republican Party) to stand on a teetering stack of ten tar buckets, the topmost of which was labeled "McCarthyism". Block later wrote that there was "nothing particularly ingenious about the term, which is simply used to represent a national affliction that can hardly be described in any other way. If anyone has a prior claim on it, he's welcome to the word and to the junior senator from Wisconsin along with it. I will also throw in a set of free dishes and a case of soap.”

Since the time of McCarthy, the word McCarthyism has entered American speech as a general term for a variety of practices: aggressively questioning a person's patriotism, making poorly supported accusations, using accusations of disloyalty to pressure a person to adhere to conformist politics or to discredit an opponent, subverting civil rights in the name of national security, and the use of demagoguery are all often referred to as McCarthyism. McCarthyism can also be synonymous with the term witch-hunt, both referring to mass hysteria and moral panic.

The 1952 Arthur Miller play The Crucible used the Salem witch trials as a metaphor for McCarthyism, suggesting that the process of McCarthyism-style persecution can occur at any time or place. The play focused on the fact that once accused, a person had little chance of exoneration, given the irrational and circular reasoning of both the courts and the public. Miller later wrote: "The more I read into the Salem panic, the more it touched off corresponding images of common experiences in the fifties."

source: http://en.wikipedia.org/wiki/McCarthyism

Tuesday, 28 October 2014

Is Historical Fiction More Truthful Than Historical Fact?


Trying to discover historical facts is a very valid way to study history. For example, questions such as 'what did medieval peasants eat?', 'how many people lived in Doncaster in the 1300s?' and 'at what speed did plague spread?' are all very useful to historians. It is counterfactual history that is often more controversial, and seen as more 'fictional'. For example, 'what would have happened if Constantine hadn't converted the Roman Empire to Christianity?' and similar questions are often the subject of much dispute.

I recently read an article by Lisa Jardine where she argued that fiction has the power to fill in the imaginative gaps left by history. She spent some time researching the lives of a group of scientists who worked on the development of the atomic bomb during World War Two. Although there are several impeccably researched non-fiction works on the subject and a number of biographies, none of these really conveyed the emotions and convictions that drove their work, and Jardine said she struggled to connect with the personal principles of the scientists who collaborated with such energy to produce the period's ultimate weapon of mass destruction. This led her to turn from fact to fiction in order to understand the motivation of those who joined the race to produce the bomb whose use at Hiroshima and Nagasaki appalled the world.

Many historians follow this same line of thought: if historical fact cannot fill the gaps in our knowledge, perhaps writers less constrained by documented evidence might do so. Macaulay once remarked that 'the novel is to history what the painted landscape is to a map'. And this is often the case: historical fiction can breathe life into the lungs of the departed, open one's eyes to new possibilities and explain subtle mysteries.

Jardine says she read Michael Frayn's play Copenhagen, which recreates a famous moment in the history of the race to develop the atomic bomb: a meeting in Denmark in 1941 between the German physicist Werner Heisenberg - best known for his "uncertainty principle", which states that the more accurately you know the position of a particle, the less accurately you know its momentum (and hence its velocity), and vice versa - and his former doctoral supervisor and mentor, the Danish quantum theorist Niels Bohr. We know very little about the meeting: just that the two men met and that it was cut short by an angry Bohr. Heisenberg was about to become head of the Nazi atomic bomb project; Bohr was Jewish, an enemy, and under surveillance in an occupied territory. He eventually fled via Sweden and Britain to the United States in 1943, where he played a significant role in the Manhattan Project.
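
For reference, the uncertainty principle alluded to above is usually written as follows (a standard textbook statement, not something taken from Jardine's article):

```latex
\Delta x \,\Delta p \ge \frac{\hbar}{2},
\qquad \text{and since } \Delta p = m\,\Delta v:\quad
\Delta x \,\Delta v \ge \frac{\hbar}{2m}
```

Here \Delta x and \Delta p are the uncertainties (standard deviations) in position and momentum, \hbar is the reduced Planck constant and m is the particle's mass; tightening the bound on one quantity necessarily loosens it on the other.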

In Frayn's version the two men relive their encounter, struggling unsuccessfully to reach any sort of agreement as to what took place; the moral accountability of both men for their key roles in unleashing nuclear weapons on the world seems somehow to depend on that conversation. As in life, no agreement is reached (after Bohr's death a letter to Heisenberg was found among his papers, vigorously denying Heisenberg's published account of what took place). However, Michael Frayn has the dramatic licence to invent exchanges, and even physical details, that focus attention on the issues at stake. He creates dialogue imaginatively, drawing the audience into the debate and prompting it to make its own attempt at assessing the motives and beliefs of the two scientists. Frayn says: "The great challenge is to get inside people's heads, to stand where they stood and see the world as they saw it, to make some informed estimate of their motives and intentions. The only way into the protagonists' heads is through the imagination."

Jardine recounts how in 1999 she took her mother, then in her early 80s, to see Frayn's Copenhagen. Her mother had met the Bohrs, and had moved in similar circles after the war. During the interval she explained to Jardine which parts of the dialogue had sounded implausible to her, and where there had been small inaccuracies. As the curtain came down on the final act and the lights came up, Jardine turned to her mother to ask how she now felt about the play. She was sitting, hands folded in her lap, tears coursing down her face. Only later did she tell Jardine it was like being there all over again.

Jardine has concluded that often fiction is needed to sharpen our senses, to focus our attention sympathetically, in order to give us emotional access to the past and that the creative imagination of fiction writers can reconnect us with the historical feelings, as well as the facts.

However, I do not agree with Jardine's suggestion that fiction is more 'truthful' than fact, as fiction is based upon the author's sensibilities and what the target audience is looking for. Jardine argues that some works of historical fiction are "successful attempts at a kind of bridging between fact and fiction that captures the feelings behind the ideas", but this is unconvincing. Heisenberg and Bohr may have thought and felt as Frayn portrays them - or they may not. Fiction involving real people can mislead readers into confusing reality with an author's imagination, even if fiction can be a good way of exploring the issues surrounding real people and events, and is often a great vehicle for learning about major world events. Yet fiction takes great liberties with the truth. For example, Frayn misrepresented Werner Heisenberg, making him morally ambiguous so that the premise of the play would be possible. Therefore, I do not think that a work of fiction, which never claims to accurately reproduce reality, can give you more insight than the actual thoughts and papers and letters of those who lived through that time and discussed it.

source: http://www.bbc.co.uk/news/magazine-29060077

Wednesday, 22 October 2014

When Does 'Modern History' Start?

A few weeks ago a New York Times article announced the possible "first peaceful transition of power in Iraq's modern history" after Nouri al-Maliki announced that he would be stepping down as prime minister and transferring power to Haider al-Abadi, also of the al-Dawa party.

That made me think about when history starts being 'modern'. Iraq's monarchical era was full of peaceful transfers of power. From the 1920s until 1958, when the monarchy of King Faisal II was overthrown, it was common for Iraq's parliament to transfer power without incident. There were elections, a parliament, a relatively open press, freedom of assembly, freedom of religion, opposition parties and opposition figures. Power wasn't entirely in the hands of parliament at the time, because there was a monarchy and the British still had influence, but does that period not count as a series of peaceful transfers in the modern history of Iraq?

The phrase "in modern history" isn't one that journalists apply only to Iraq, of course. However, there is no universally agreed periodisation for history. The definition of modern history depends heavily on the country or region under discussion. For example, Iran's 'early modern' era started in about 1500, whilst 'modern European history' begins in the 19th or 20th century (and runs up to the present).

I was recently having a discussion with a lawyer who told me that he thought 'history' begins when all the main protagonists are dead (making WWI 'history' but not the Cold War). In my opinion this is not a useful definition, as it does not make any distinction between 'history' and 'modern history'. Nor do I think that modern history is simply politics; I believe they are two very different ways of looking at past events.

In my opinion, when a historian declares something to be the 'best' or most 'violent' (for example) thing in modern history, it is important to consider how far back 'modern history' goes in that particular country or region in order to see whether the claim is valid, as 'modern history' is a flexible term that means different things in different circumstances.

Thursday, 16 October 2014

Does It Help to Know History?

The point of studying history is not that it gives students with history degrees a practical advantage over non-history students, but that it enables us to enter into a long-existing, ongoing conversation. It isn't productive in a tangible sense; it is productive in a human sense. The action, whether rewarded or not, really is its own reward.

Every writer, of every political bias, has a neat historical analogy, or mini-lesson, with which to preface an argument for why we ought to bomb one organisation or side with another against a state we were bombing before. But the best argument for reading history is not that it will show us the right thing to do in one case or the other, but rather that it will show us why even doing the right thing rarely works out. The advantage of having a historical sense is not that it will lead you to instructions on what to do or not to do, but that it will teach you that no such answers exist. What history generally “teaches” is how hard it is for anyone to control it, including the people who think they’re making it.

Roger Cohen, for example, wrote about all the mistakes that he thought the United States has made in the Middle East over the past decade, with the implicit notion that there are two histories: one recent, in which everything that the United States has done has been ill-timed and disastrous; and then some other, superior, alternate history, in which imperial Western powers intervened in the region wisely, picking the right sides and thoughtful leaders, promoting militants without aiding fanaticism, and generally aiding the cause of peace and prosperity. As the Libyan intervention demonstrates, the best will in the world combined with the best candidates cannot mend broken polities quickly. What "history" shows is that the same forces that led to the Mahdi's rebellion in Sudan more than a century ago - rage at the presence of a colonial master; a mad turn towards an imaginary past as a means to even the score - keep coming back, and remain just as resistant to management, close up or at a distance, as they were before. ISIS is a horrible group doing horrible things, and there are many factors behind its rise. But it came to be a threat and a power less because of all we didn't do than because of certain things we did do - foremost among them that massive, forward intervention, the Iraq War. (The historical question to which ISIS is the answer is: what could possibly be worse than Saddam Hussein?)

Another, domestic, example of historical blindness concerns Lyndon B. Johnson, a ruthless political operator who passed crucial bills such as the Civil Rights Act. He also pushed the Vietnam War through Congress: a moral and strategic catastrophe that ripped the United States apart and inflicted terror on the Vietnamese. It also led American soldiers to commit atrocious war crimes, almost all of them left unpunished. Johnson did many good things, but to use him as a positive counterexample of leadership to Barack Obama or anyone else is marginally insane.

Johnson’s tragedy was critically tied to the cult of action, of being tough and not just sitting there and watching. But not doing things too disastrously is not some minimal achievement; it is a maximal achievement, rarely managed. Studying history doesn’t argue for nothing-ism, but it makes a very good case for minimalism: for doing the least violent thing possible that might help prevent more violence from happening.

The real sin that the absence of a historical sense encourages is presentism, in the sense of exaggerating our present problems out of all proportion to those that have previously existed. It lies in believing that things are much worse than they have ever been - and, thus, than they really are - or are uniquely threatening rather than familiarly difficult. Every episode becomes an epidemic, every image is turned into a permanent injury, and each crisis is a historical crisis in need of urgent, aggressive handling - even though all experience shows that aggressive handling of such situations has, in the past, quite often made things worse. (The history of medicine shows that no matter how many interventions are badly made, the experts who intervene make more: the sixteenth-century doctors who bled and cupped their patients and watched them die just bled and cupped others more.) What history actually shows is that nothing works out as planned, and that everything has unintended consequences. History doesn't show that we should never go to war - sometimes there is no better alternative. But it does show that the results are entirely uncontrollable, and that we are far more likely to be made by history than to make it. History is past, and singular, and the same year never comes round twice.

Those of us who obsess, particularly in this centennial year, over the tragedy of August 1914 - over how an optimistic and largely prosperous civilisation could commit suicide - don't believe that the trouble then was that nobody read history. The trouble was that they were reading the wrong history, a make-believe history of grand designs and chess-master-like wisdom. History, well read, is simply humility well told, in many manners. And a few sessions of humility can often prevent a series of humiliations. What should, say, the advisers to Lord Grey, the British foreign secretary, have told him a century ago? Surely something like: let's not lose our heads; the Germans are a growing power who can be accommodated without losing anything essential to our well-being and, perhaps, whose direction we can shape; Serbian nationalism is an incident, not a casus belli; the French are understandably determined to take back Alsace-Lorraine, but this is not terribly important to us - nor to them either, really, if they could be made to see that. And the Ottoman Empire is far from the worst arrangement of things that can be imagined in that part of the world. We will not lose our credibility by failing to sacrifice a generation of our young men. Our credibility lies, exactly, in their continued happy existence.

Many measly compromises would have had to be made by the British; many challenges postponed; many opportunities for aggressive, forward action shirked - and the catastrophe, which set the stage and shaped the characters for the next war, would have been avoided. That is historical wisdom, the only wisdom history supplies. The most tempting lesson that history gives is not to tempt it. Those who simply repeat history are condemned to leave the rest of us to read all about that repetition in the news every morning.

source: http://www.newyorker.com/news/daily-comment/help-know-history?utm_source=tny&utm_campaign=generalsocial&utm_medium=twitter&mbid=social_twitter

Sunday, 12 October 2014

Easter Island


One of the most fascinating things about Easter Island, for me, is how the obsessive building of stone statues, which consumed a disproportionate share of the island's resources, helped make its society so unsustainable that the native inhabitants of Easter Island were almost entirely wiped out.

Easter Island was, for most of its history, one of the most isolated territories on Earth. Its inhabitants, the Rapa Nui, have endured famines, epidemics, cannibalism, civil war and slave raids, and have seen their population crash on more than one occasion.

The Austronesian Polynesians who first settled the island are likely to have arrived from the Marquesas Islands to the west, perhaps as recently as 1200 CE. These settlers brought bananas, taro, sugarcane and paper mulberry, as well as chickens and Polynesian rats. It has been suggested that the settlers sought out an isolated island because of high levels of ciguatera fish poisoning in their previous home waters.

Some contact between the culture of the Easter Islanders and South Americans is suggested by the dispersion of the sweet potato: this staple of the pre-contact Polynesian diet is of South American origin, and there is no evidence that its seed could spread by floating across the ocean. Either Polynesians travelled to South America and back, or South American balsa rafts drifted to Polynesia, possibly unable to make a return trip because of less developed navigational skills and more fragile boats. Jacob Roggeveen's expedition of 1722 made the first recorded European contact with the island. He reported "remarkable, tall, stone figures, a good 30 feet in height", described the islanders as "of all shades of colour, yellow, white and brown", and noted that they distended their ear lobes so greatly with large disks that when they took them out they could "hitch the rim of the lobe over the top of the ear". Roggeveen also observed that some of the islanders were "generally large in stature". The islanders' tallness was also witnessed by the Spanish, who visited the island in 1770 and measured heights of 196 and 199 cm.

According to legends recorded by missionaries in the 1860s, the island originally had a very clear class system, with an ariki, or king, wielding absolute, god-like power ever since Hotu Matua had arrived on the island. The most visible element of the culture was the production of massive moai, which were part of ancestral worship. With their strictly unified appearance, moai were erected along most of the coastline, indicating a homogeneous culture and centralised governance. In addition to the royal family, the island's population consisted of priests, soldiers and commoners. The last king, along with his family, died as a slave in the 1860s in the Peruvian mines. Long before that, the king had become a mere symbolic figure, remaining respected and untouchable but holding only nominal authority.

Several foreign vessels approached Easter Island during the late 18th and early 19th centuries, but the islanders had become openly hostile to any attempt to land, and very little new information emerged before the 1860s. In December 1862 Peruvian slave raiders struck Easter Island. Violent abductions continued for several months, eventually capturing or killing around 1,500 men and women, about half of the island's population. International protests erupted and the slaves were finally freed in autumn 1863, but by then most of them had already died of tuberculosis, smallpox and dysentery. In the end only a dozen or so islanders managed to return from the horrors of Peru, and they brought smallpox with them, starting an epidemic that reduced the island's population to the point where some of the dead were not even buried. Adding to the chaos were violent clan wars, with the remaining people fighting over the newly available lands of the deceased, bringing further famine and death to the dwindling population.

Since being given Chilean citizenship in 1966, the Rapa Nui have re-embraced their ancient culture, or what could be reconstructed of it; but because the population fell by around 97% within a few decades, much of that culture and heritage has been lost.

Mataveri International Airport is the island's only airport. In the 1980s its runway was lengthened by the US space programme to 3,318 m so that it could serve as an emergency landing site for the Space Shuttle. This enabled regular wide-body jet services and a consequent increase in tourism on the island which, coupled with migration from mainland Chile, threatens to alter the island's Polynesian identity. Land disputes have created political tensions since the 1980s, with some of the native Rapa Nui opposed to private property and in favour of traditional communal property.

On 30 July 2007 a constitutional reform gave Easter Island the status of a special territory of Chile. Pending the enactment of a special charter, the island continues to be governed as a province of the Valparaíso Region.

Friday, 10 October 2014

British Imperialism

One of my readers sent me this picture today, in response to the picture I posted earlier this week. Britain has invaded almost 90% of the world's countries - quite an impressive feat for a small island off mainland Europe! The map below shows every country that Britain has ever invaded.

Tuesday, 7 October 2014

Independence from the UK

I really like this picture that I found online last week, showing every country that has ever gained independence from the UK, and when.

Friday, 3 October 2014

Is History a Science?

A highly debated question over the past few decades has been whether or not the study of history can be counted as a science. A lot of the books I have read recently discuss this idea, and I have decided to collect together, in one (rather jumbled) blog post, a few of the notes I have made and ideas I have had whilst reading.

Trevelyan said history is a mixture of the scientific (research), the imaginative or speculative (interpretation) and the literary (presentation). Historians' facts will always be incomplete; no one is ever going to unravel scientifically the mental processes of twenty million Frenchmen during the French Revolution of 1789. Yet interpretations of this event cannot be arrived at by a mere process of induction either.

Elton thought history was neither an art nor a science. 'History', he declared proudly, 'is a study different from any other and governed by rules peculiar to itself'. Elton argued that historical knowledge is cumulative: through the steady accumulation of empirical knowledge, professional historians were advancing 'ever nearer to the fortress of truth', making history somewhat scientific. He also distinguished the historian from the poet or novelist by arguing that historians have to be trained; that is, they have to learn the technical details of the documents they use. However, Zeldin argued that the history one writes is an expression of one's individuality, and that one cannot be trained or taught how to write history.

Carr said there is no clear separation between the researcher and the object of research. However, he did say that historians should not judge the past in moral terms; their purpose is rather to understand how the past has contributed to human progress. It is pointless, for example, to condemn slavery in the ancient world as immoral; the point is to understand how it came about, how it functioned, and why it declined, opening the way to another form of social organisation. There is also, surely, a moral element in research in the natural sciences: moral concerns may drive scientific research, or they may emerge from it.

Carr also argued that whilst no two historical events are identical, no two atoms are either, nor two stars, nor two living organisms, yet this does not stop scientists from framing their laws. Similarly, he said, 'the very use of language commits the historian, like the scientist, to generalisation'. He pointed out that history is not merely confined to the establishment of particular, isolated facts: it teaches lessons, such as the constant reference made by delegates at the Versailles Conference in 1919 to the Vienna Congress of 1815. Carr also said history can still be a science even though it cannot predict the future. The law of gravity, he observed, could not predict that a particular apple would fall at a particular time in a particular place. Scientific laws operate only under certain specific conditions, and history seems able to predict events too in some ways, such as the necessary conditions, time and place in which a revolution might take place - thus history was just like any other science in its generation of laws and predictive capacities.

One reason to suppose that history is not a science is that historians' conclusions cannot be experimentally replicated. But a number of important sciences are based, like history, on observation rather than experimentation - astronomy, for example. Such observation may not be directly visual; it may be indirect, as when the existence of Pluto was postulated through the observation of irregularities in the orbital path of the next-outermost planet in the solar system. Likewise, we infer the existence of some objects in outer space from radio waves, but we cannot recreate them in experiments as we can a chemical reaction. Yet nobody has claimed that astronomy is not a science.

Froude claimed that history had to confine itself to a presentation of facts, and not be used to 'spell out' theories of a 'scientific nature'. Indeed, in every case where historians have tried to prove generally applicable laws of history, they have been easily refuted by critics who have demonstrated instances where the laws do not apply. In A Study of History, for example, Toynbee tried to draw up a series of general laws as to why civilisations rise, develop and collapse; this has since been criticised, with some historians suggesting Toynbee simply selected the evidence he wanted. Buckle, likewise, suggested that history could be scientifically reduced to a series of mathematical formulae.

While many people, especially politicians, try to learn lessons from history, history itself shows that in retrospect very few of these lessons have been the right ones. Time and time again, history has proved a very bad predictor of future events. This is because history never repeats itself; nothing in human society, the main concern of the historian, ever happens twice under exactly the same conditions or in exactly the same way. And when people try to use history, they often do so not in order to accommodate themselves to the inevitable, but in order to avoid it. For example, British politicians in the post-war era based much of their conduct of foreign policy on the belief that the 'appeasement' of dictators, such as that carried out by Prime Minister Neville Chamberlain in his relations with Hitler in 1938-39, could only lead to disaster. In practice, however, this very belief only too often led to disaster itself, most notoriously in 1956, when the British Prime Minister Sir Anthony Eden, obsessed with the desire to avoid the 'appeasement' of dictators that he himself had criticised in the later 1930s, launched an ill-advised and unsuccessful military strike against the Egyptian government of Colonel Nasser when it nationalised the Anglo-French-owned Suez Canal.

Nor does history enable one to predict revolutions, however they are defined. Historians notoriously failed to predict, for instance, the fall of the Berlin Wall and the collapse of the Soviet Union in 1989-91. And although Carr argued that the historian's role was to use an understanding of the past in order to gain control of the present, very few historians have shared this concept of using the past as the basis for concrete predictions. Whilst a chemist knows, in advance, the result of mixing two chemicals together in a lab, a historian has no such advance knowledge.

History, however, can produce generalisations. It can identify, or posit with a high degree of plausibility, patterns, trends and structures in the human past. In these respects it can legitimately be regarded as scientific. But history cannot create laws with predictive power. An understanding of the past might help in the present: it can broaden our knowledge of human nature, provide us with inspiration or warning, and suggest plausible (though fallible) arguments about the likelihood of things happening under certain conditions. None of this, however, comes anywhere near the immutable predictive certainty of a scientific law. Marx, Engels, Toynbee and Buckle all claimed to have discovered laws in history; all were wrong.

Paul Kennedy's The Rise and Fall of the Great Powers argued that there is a pattern in history whereby wealthy states create empires but eventually overstretch their resources and decline. The book used the failure of the Habsburg Empire to achieve European domination in the 16th and 17th centuries as the basis for a prediction that the United States would be unable to sustain its global hegemony far into the 21st century. Written in 1987, the book also argued that the Soviet Union was not close to collapse. Within a few years the Soviet Union had indeed collapsed, and the world hegemony of the United States appeared more assured than ever. Even in the economic boom of the 1990s the US showed few signs of suffering from the 'imperial overstretch' which Kennedy had prophesied. As soon as Kennedy tried to turn his generalisations into laws, he ran into trouble. It is always a mistake for a historian to try to predict the future.

So history is a science in the weak sense of the word, but it is also an art. Marc Bloch claimed it is a craft too, because historians learn on the job how to handle their materials and wield the tools of their trade. History is a varied and protean discipline, and historians approach what they do in different ways. That means history can be at once scientific, linguistic, literary and a craft.

Monday, 22 September 2014

Ice Cream and War

I recently read two very interesting (and very different) articles that made me think about how preconceptions influence historical writing.

The first article was a review by Jay Winter of Niall Ferguson's book The Pity of War. Winter argued that Ferguson brought his politics to the study of war, using counterfactual history to support his arguments. For example, Ferguson argued that had the British not entered WWI, there is 'no question' that the Germans would have won the war. Ferguson also claimed that WWI was 'the greatest error of modern history', because its ultimate aim - to end German dominance of Europe - failed anyway, as Germany now dominates Europe (economically, though, not militarily). Ferguson further claimed that Britain sacrificed her imperial status through the war, squandering her assets and manpower on an avoidable and pointless conflict that ultimately led to the decline of the British Empire. Winter disagrees, arguing that so many variables make up history that it is impossible to specify which exact variable caused a given consequence - for example, which exact variable caused the Germans to lose WWI (British entry, the strength of their army, their military planning, etc.).

The second article, 'Making and Eating Ice Cream in Naples: Rethinking Consumption and Sociability in the Eighteenth Century' by Melissa Calaresu, shows how northern European historians have mistakenly assumed that ice cream must have been a food for the rich: firstly because they were unaware of the snow trade, and secondly because sources tend to be biased towards the rich (for example, cookbooks for the stewards of large houses). This means that historians tend to have plenty of sources showing ice cream being served at aristocratic banquets in Naples, and few sources showing it being eaten in coffee and ice cream shops, sold on the street or made in modest households. Because northern European historians saw how expensive ice was in northern Europe at the time, they wrongly assumed the same was true in Naples, and therefore that ice cream was an expensive treat. In reality there was a snow trade in Naples that meant anyone, from a child on the street to a rich aristocrat, could buy ice cream as long as they had a few coins to spare.

What the two articles show is how historians can be blinded by preconceptions when looking at evidence, and how important it is for a historian to approach sources from a totally fresh point of view, setting aside the assumptions and prior knowledge they have (or think they have), which may otherwise colour their interpretation.

Sunday, 21 September 2014

The Role of Chance in History

I have just finished reading Samuel Pepys: The Unequalled Self by Claire Tomalin, a hilarious and eye-opening window into the late 17th century. One of the things I found most interesting about the book, however, was how Tomalin described the discovery of the Diaries.

Pepys' Diaries ended up in the Pepys Library at Magdalene College, Cambridge. In 1812 a Scottish historian, David Macpherson, included a few words from the Diary to illustrate the growth of the tea trade in Europe: Pepys had drunk his first cup of tea and recorded the fact in September 1660, writing the words 'Cupp' and 'Tee' in longhand. Pepys in fact wrote the first mention we have in English of anyone drinking a cup of tea.

What is most fascinating about this is how Macpherson found this one reference among six volumes of the Diary, how he managed to understand it (it was written in code), and how he came to know to look in the Diary for information on the tea trade in the first place (it was relatively unknown, and Pepys was far from a household name at the time).

Macpherson's book, History of the European Commerce with India, was noticed by chance by a Cambridge scholar, George Neville, who was intrigued to know what else was in the Diaries - Pepys had, after all, lived through momentous times: the restoration of the monarchy, the last great plague epidemic and the Great Fire of London of 1666. Neville set an undergraduate, John Smith, the task of breaking the code and transcribing the Diary. Smith spent three years on this task, from the spring of 1819 until 6 April 1822, when he completed the transcription of Pepys' 3,102 pages onto 9,325 of his own, filling 54 notebooks.

Had Pepys not had that cup of tea, had Macpherson not found it and written about it in his book, had Neville been less curious about what else the Diaries might have held, or had Smith been less capable, then the name Pepys might only be known to naval historians, and a very considerable part of what we know about how people lived in the second half of the 17th century would be unknown.

So it is very lucky he had that cup of tea.

Saturday, 20 September 2014

1919 And All That

The first scheduled daily international airline service in the world left Hounslow Heath (the first international London airport) on 25 August 1919.

The Heath was a large flat area south-east of today's Heathrow Airport. It was initially very popular with airlines because it was one of the first airfields in the country with full customs facilities. Originally a cavalry training ground, the airfield and the surrounding heathland were owned by the Army, and tensions between it and the civil airlines meant a new airport was rapidly developed at Croydon. Hounslow Heath closed as an airfield in March 1920.

The first flight was made by a de Havilland DH.4a, which had a single pilot and seats (in an enclosed cabin) for two passengers. Flight time to Paris was around two and a half hours. The pilot was Lt E. H. Lawford, and behind him, with a load of luggage (including grouse, clotted cream, leather and newspapers), was a journalist from the London Evening Standard, George Stevenson-Reece. Other journalists followed on a later flight, which they then wrote up as having been the first.

Thursday, 18 September 2014

A Brief History Of The Union

Today 4 million Scottish voters are expected to turn out at over 2,000 polling stations all over Scotland and vote on whether or not they want to make Scotland an independent country. Whilst we won't know the result until tomorrow, I have decided to do a short blog on the history of the union.

For centuries the two countries existed separately with two crowns. English attempts to invade in the 13th and 14th centuries failed, but in 1603 King James VI of Scotland inherited the English throne after his cousin Elizabeth I died childless, effectively uniting the crowns.

In 1707 the Act of Union was signed by the English and Scottish parliaments, creating the United Kingdom of Great Britain. Thousands of ordinary Scottish people revolted at what they saw as a 'takeover' rather than a 'merger'. The fact that the Act was signed by Scottish nobles who benefited financially from the union only increased hatred for it.

In 1715 the first Jacobite uprising took place: British forces crushed an attempt by Scottish supporters of the exiled House of Stuart to regain the throne. Thirty years later the second Jacobite uprising took place, led by Bonnie Prince Charlie. His army seized Edinburgh but was defeated at the Battle of Culloden. The Jacobites were rounded up, imprisoned or executed, and shortly afterwards the wearing of the kilt was banned.

The 1760s saw some of the most influential figures of the Enlightenment emerge, such as Adam Smith, David Hume and James Hutton.

In 1885 a Scottish Secretary was established after Scottish MPs lobbied Prime Minister William Gladstone for one. In 1886 the Scottish Home Rule Association was formed. In 1900 the Young Scots Society, a Liberal group committed to Scottish home rule, was created; by 1914 it had 10,000 members and 50 branches. A Home Rule Bill introduced in 1913 by William Cowan was stymied by WWI.

In 1934 the Scottish National Party was formed in Glasgow, winning its first seat in the House of Commons in 1945, before losing it three months later. In 1967 the SNP won its next seat in the House of Commons, in a Hamilton by-election.

In 1973 the Kilbrandon Commission recommended devolved assemblies for Scotland and Wales after a four-year inquiry. In 1979 a referendum on Scottish devolution ended in a 'no' vote, as it failed to reach the 40% threshold required. In 1999 elections were held for the first 129-member Scottish Parliament: Labour won 56 seats and the SNP 35. In 2004 the new Scottish Parliament building at Holyrood, which cost over £400 million, was opened by the Queen.

In 2011 the SNP, led by Alex Salmond, won a majority in the Scottish elections. Salmond promised a referendum on Scottish independence, the subject having been brought to the surface in 2007 by the Scottish government through hundreds of meetings between the public and ministers.

In October 2012 Salmond and Cameron signed the Edinburgh Agreement, paving the way for a Scottish independence referendum. In November 2013 the Scottish government began its campaign by publishing the paper 'Scotland's Future', making the case for independence. Salmond also changed the law to allow everyone aged 16 and over, rather than 18 and over, to vote.

In August 2014 Salmond and Alistair Darling (leader of the Better Together campaign) debated on television, clashing over oil revenues, currency and the future of nuclear weapons. One poll suggested Salmond won the debate, 71% to Darling's 29%. Today is the day of the referendum, and if there is a 'yes' vote, 24 March 2016 has been suggested as the date for Scottish independence.

Saturday, 13 September 2014

The Loch Ness Monster



Last week a photographer claimed to have taken a photograph of the Loch Ness Monster in the Lake District. Her claim inspired me to do some research on the Loch Ness Monster and its history.

The Loch Ness Monster is a creature that supposedly lives in Loch Ness in the Scottish Highlands. Whilst its existence has been suggested and 'photographed' many times, it has never been documented by the scientific community. Popular interest and belief in the animal's existence have varied since it was first brought to the world's attention in 1933.

The most common speculation among believers is that the creature represents a line of long-surviving plesiosaurs. The scientific community regards the Loch Ness Monster as a modern-day myth, and explains sightings as possible misidentifications, hoaxes or wishful thinking. Despite this, it remains one of the most famous examples of cryptozoology. 

On 4 August 1933 the Courier published an article about George Spicer, who claimed that a few weeks earlier, while motoring around the loch, he and his wife had seen "the nearest approach to a dragon or pre-historic animal that I have ever seen in my life" trundling across the road toward the loch carrying "an animal" in its mouth. This led to a series of people writing in to the Courier also claiming to have seen 'Nessie'. These stories soon reached the national (and later the international) press, which described a "monster fish", "sea serpent" or "dragon", eventually settling on "Loch Ness Monster".

On 6 December 1933 the first purported photograph of the monster, taken by Hugh Gray, was published in the Daily Express, and shortly afterwards the creature received official notice when the Secretary of State for Scotland ordered the police to prevent any attacks on it. In 1934 R. T. Gould published a book about his personal investigation, together with collected records of reports pre-dating 1933.

The earliest report of the Loch Ness Monster appears in the Life of St. Columba by Adomnán, written in the 7th century. According to Adomnán, writing about a century after the events he described, the Irish monk Saint Columba was staying in the land of the Picts with his companions when he came across the locals burying a man by the River Ness. They explained that the man had been swimming the river when he was attacked by a "water beast" that had mauled him and dragged him under. They tried to rescue him in a boat, but were able only to drag up his corpse. However, sceptics question the narrative's reliability, noting that water-beast stories were extremely common in medieval saints' Lives; as such, Adomnán's tale is likely to be a recycling of a common motif attached to a local landmark.

In January 1934 a motorcyclist named Arthur Grant claimed to have nearly hit the creature while approaching Abriachan on the north-eastern shore, at about 1 a.m. on a moonlit night. On 21 April 1934 a photo of the creature attributed to Dr Robert Wilson (the 'Surgeon's Photograph') was published; it was later revealed to be a fake.

Sightings of the monster had increased following the building of a road along the loch in early 1933, which brought both workmen and tourists to the formerly isolated area. In 1938 Inverness-shire Chief Constable William Fraser wrote a letter stating that it was beyond doubt that the monster existed. In May 1943 C. B. Farrel of the Royal Observer Corps was supposedly distracted from his duties by a Nessie sighting. In December 1954 a strange sonar contact was made by the fishing boat Rival III; the vessel's crew observed sonar readings of a large object keeping pace with the boat for about 800 m. In 1963 film of the creature was shot in the loch from a distance of 4 kilometres; because of the distance at which it was shot, it has been described as poor quality.

In 2003 the BBC sponsored a full search of the loch using 600 separate sonar beams and satellite tracking. The search had enough resolution to pick up a small buoy, yet no animal of any substantial size was found. Despite high hopes, the scientists involved in the expedition admitted that this essentially proved the Loch Ness Monster to be a myth.

In 2004 a documentary team from the television channel Five, using special-effects experts from the movies, tried to make people believe there was something in the loch. They constructed an animatronic model of a plesiosaur and dubbed it 'Lucy'. Despite setbacks, such as Lucy sinking to the bottom of the loch, about 600 sightings were reported in the places where the hoaxes were staged.

In 2005, two students claimed to have found a huge tooth embedded in the body of a deer on the loch shore. They publicised the find widely, even setting up a website, but expert analysis soon revealed that the "tooth" was the antler of a muntjac. The Loch Ness tooth was a publicity stunt to promote a horror novel by Steve Alten titled The Loch.

In 2007 a video purporting to show Nessie jumping high into the air appeared on YouTube. This was revealed by the online amateur sceptics' community eSkeptic to be a viral ad promoting the then-upcoming Sony Pictures film The Water Horse. The release of the film confirmed the eSkeptic analysis: the viral video comprised footage from The Water Horse.

Whilst a considerable amount of time and money has gone into searching for the Loch Ness Monster, the repeated failure to prove its existence shows that 'Nessie' is nothing more than an interesting tale - as much as we might all wish she existed!