mcgillianaire: (Sachin Tendulkar)
[SOURCE]

The Cincinnati Reds beat the Philadelphia Phillies 2-1 on this night in 1935 in Major League Baseball’s first-ever night game, played courtesy of recently installed lights at Crosley Field in Cincinnati.

The first-ever night game in professional baseball took place May 2, 1930, when a Des Moines, Iowa, team hosted Wichita for a Western League game. The game drew 12,000 people at a time when Des Moines was averaging just 600 fans per game. Evening games soon became popular in the minors: As minor league ball clubs were routinely folding in the midst of the Great Depression, adaptable owners found the innovation a key to staying in business. The major leagues, though, took five years to catch up to their small-town counterparts.

The first big league night game on this day in 1935 drew 25,000 fans, who stood by as President Roosevelt symbolically switched on the lights from Washington, D.C. To capitalize on their new evening fan base, the Reds played a night game that year against every National League team–eight games in total–and despite their lousy record of 68-85, paid attendance rose 117 percent.

Though baseball owners had a well-deserved reputation for being old-fashioned, most teams soon followed suit, as they knew night games would benefit their bottom line. Teams upgraded their facilities to include lights throughout the 1930s and 40s, and before long, most of the league had night games on the schedule. Wrigley Field, on Chicago’s North Side–the second oldest major league park after Boston’s Fenway–was the last of the parks to begin hosting night games. Wrigley’s tradition of hosting only day games held for 74 seasons until August 8, 1988, when the Cubs hosted the Philadelphia Phillies. That game was rained out in the third inning, so Wrigley’s first night game is officially recorded as a 6-4 win over the New York Mets on August 9, 1988. The Cubs are the only major league team that still plays the majority of their home games during the day.
mcgillianaire: (Ari G)
Congratulations to Canada for throwing out the Tories and electing a majority Liberal government. About time, I might add. But is it just me who is uneasy with a seemingly growing trend of nepotism at the highest levels of Western democracies?
mcgillianaire: (Scale of Justice)
I've just handed in the following essay as part of my Academic Writing course at Harvard's Extension School. It culminates a six-week process of cumulative work that began with three assigned readings on aspects of education. We had to pick one of the three readings and build our exercises around it. The essay below is my final product. I hope you enjoy it. I think it can form the basis of a longer essay in the future, and there is certainly room for improvement, but I am fairly pleased with the way my writing has developed, particularly with regard to structure and transitions, which have been a longstanding weakness of mine. I would love to receive feedback from you guys too! Thanks for taking the time to read it. The essays in the footnotes are worth a read too!



Intelligence does not have to be schooled, and education takes multiple forms. What we need as a society is a recalibration of the assumptions we make about knowledge acquisition. In "Blue-Collar Brilliance", Mike Rose, a research professor at the Graduate School of Education and Information Studies at UCLA, challenges the widely held notion that intelligence can only be measured by the amount of formal education a person has acquired, while offering an alternative viewpoint that emphasizes the extent to which blue- and pink-collar workers harness their intellect.1 He is right to challenge the status quo, but even his well-reasoned argument falls short of extending the thesis to society at large, regardless of whether someone is employed or not. And that is crucial, because for many people intelligence is still a synonym for formal education, and the more letters you have after your name, the more likely you are to be perceived as possessing superior intelligence. By simply extending the cognitive franchise to blue- and pink-collar workers, we ignore and deny the cerebral contributions of millions of stay-at-home parents and other less academically qualified thinkers around the world.

The assumptions we make about knowledge and intelligence acquisition have a direct impact on the way our entire economy is structured. Just look at the salary and wage differentials between those generally classified as white- and blue-collar workers. According to Bureau of Labor Statistics data from May 2014, the mean annual wage for lawyers, airline pilots and financial managers was roughly $130,000. But for electricians and plumbers it was $54,000, and $42,000 for truck drivers. Other blue- and pink-collar workers, such as janitors, grounds maintenance workers, auto mechanics and waitresses, earned median hourly wages ranging from just $9.01 to $17.84.2 Under the present system, wages do not reflect the amount of a worker’s thought and effort. The perfect example at the other extreme is the stay-at-home parent, who earns no wage at all, yet you would be hard-pressed to find one who ever stops thinking about their work (i.e. their children and partner). And despite the wide gap in mean wages between white- and blue-collar workers, both groups toil a similar number of hours at their respective workplaces. If anything, blue- and pink-collar employees work longer hours than their more formally educated counterparts, sometimes fitting in two or more jobs in order to make ends meet. And no doubt their experiences are as taxing on the mind as they are on the body.

But are the best and the brightest truly more intelligent? And have they acquired more knowledge? I do not believe so. Take, for instance, my octogenarian paternal grandmother. Denied formal schooling beyond fourth grade, she has remained a stay-at-home mom her entire life. Married before sixteen, a mother of five by her mid-twenties (two of whom she lost in infancy), forced to accommodate six orphaned children from her in-laws’ family soon after, and widowed at fifty, she has been compelled by circumstance to constantly adapt to a changing environment. Without a degree in home economics, she learnt how to ration a fixed supply of meager financial and food resources for the enlarged household. Religion helped provide direction in her life, and she imparted the wisdom gained from its parables to her children. Even today when I visit her in my father’s hometown in southern India, it never ceases to amaze me how everybody who knows her, irrespective of age, solicits her advice on dealing with life. Indeed she is the epitome of someone schooled in life. Despite lacking a formal education, she has cultivated her intelligence by acquiring knowledge through daily experience and put it to use without ever receiving a penny. And yet the society we live in would dismiss her rich contribution to it.

Our assumptions about intelligence, work and social class affect the way we treat even our fellow workers. Consider the example of a nurse in my father’s hospital who assists with surgeries. Various surgeons, including my father, work with him, and through many years of experience and observation, the nurse has gained sufficient knowledge to offer useful suggestions during a surgery, particularly in the middle of a tricky procedure or sticky situation. More often than not, the nurse’s insight has proven significant. But when my father recommended the nurse’s input to a fellow surgeon friend, the latter was not immediately convinced. It took several further surgeries before he acknowledged the nurse’s potential and contribution. Had the nurse completed the academic qualifications to perform surgeries himself, there would not have been any hesitation on the other surgeon’s part to accept my father’s recommendation. Rose posits that “generalizations about intelligence, work, and social class deeply affect our assumptions about ourselves and each other, guiding the ways we use our minds to learn, build knowledge, solve problems, and make our way through the world”, and he is absolutely right.

Rose offers several compelling reasons why we need to redress the imbalance in the assumptions we make about intelligence and knowledge acquisition, outlining how blue-collar workers’ “use of tools requires the studied refinement of stance, grip, balance, and fine-motor skills” and specifying how “carpenters have an eye for length, line, and angle; mechanics troubleshoot by listening, [and] hair stylists are attuned to shape, texture, and motion”. It is high time we added to this list the millions of lifelong homemakers who also tap into their intellect on a daily basis by rearing children and keeping families together.

With all this in mind, you may wonder how we might recalibrate the assumptions we make about intelligence and knowledge acquisition. For that, we need to identify why they are miscalibrated in the first place. Rose argues that “our culture – in Cartesian fashion – separates the body from the mind, so that, for example, we assume that the use of a tool does not involve abstraction. We reinforce this notion by defining intelligence solely on grades in school and numbers on IQ tests”. William Deresiewicz, an award-winning essayist and literary critic, builds on this by describing how “being at an elite college, and going on from an elite college – all involve numerical rankings. You learn to think of yourself in terms of those numbers”, while adding that “one of the great errors of an elite education, then, is that it teaches you to think that measures of intelligence and academic achievement are measures of value in some moral or metaphysical sense. But they’re not”.3 And they are both right.

Even so, Deresiewicz concedes that “the advantages of an elite education are indeed undeniable”. Yet, however incredible these elite institutions are, and however substantial the contribution they make to society, they also own a share of the responsibility for reinforcing divisions of people by class, occupation and intelligence. Deresiewicz hits the nail on the head when he says that “the problem begins when students are encouraged to forget this truth, when academic excellence becomes excellence in some absolute sense, when ‘better at X’ becomes simply ‘better’”. And that is the entrenched reality we need to overcome in order to redress the imbalance perpetuated by the prevailing system. One possible solution is to acknowledge, honor or even compensate those forms of intelligence that are not directly linked to formal education.

Admittedly, attempting to change the way most people think is potentially a fool’s errand. But if we do not make any effort at all to change, even slightly, the way people make assumptions about intelligence, class and occupation, then life will carry on as it is and we will continue devaluing and degrading the contributions of millions – perhaps even billions – around the world. The onus is on us to make a difference, however small it may be. After all, as the sixth-century B.C. Chinese philosopher Laozi pointed out, even “a journey of a thousand [miles] begins with a single step”.4 We already know that there are different ways to acquire knowledge and intelligence. So the journey we need to embark on does not involve uncharted territory. It simply requires a reorientation and retracing of steps to a fairer and more balanced society. Is that too much to ask?

1. https://theamericanscholar.org/blue-collar-brilliance/
2. http://www.bls.gov/oes/current/oes_nat.htm
3. https://theamericanscholar.org/the-disadvantages-of-an-elite-education/
4. https://en.wikiquote.org/wiki/Laozi#Tao_Te_Ching
  • You should also read this thought-provoking essay on education by Louis Menand - another of the three assigned to us.
  • mcgillianaire: (South Park Me)
    I'm in America. I've been here 3 weeks, and I'll be here for 9 more. The weather in Providence, RI is a lot warmer (and sunnier) than London - so far. I'm thoroughly enjoying it. I'm staying with my sister and future bro-in-law. I'm taking a couple of online courses from Harvard's continuing education school and a course to prepare for the GRE. The GRE is a standardized test for postgrad studies in 'Murica. I've decided to turn my back on the legal profession and return to university next year. I'll be applying for public policy degrees in the neighborhood. Boston is commuting distance so there are quite a few options to pick from. I definitely don't have the grades or accomplishments to even consider the likes of Harvard's Kennedy School of Government, but hopefully I will get admitted to the next rung of alternatives below it.

    My family would like me to remain in America after my postgrad degree, preferably close to my sister, but I am pretty clear in my mind that this is just a short adventure across the pond. That said, I am really looking forward to the opportunity of studying in America, and I am open to the idea of staying here for a year or two afterwards if I can secure a job in DC (or anywhere else, as long as it's in public policy). But I would like to return to London eventually.

    I haven't quite left permanently either. Once my three months on the visa waiver program ends, I'll be flying back to London for the Christmas period. As amazing as the weather is right now and as cool as it is to be in America, I miss Blighty. Thank fuck, if you'll pardon my French, for smartphones and tablets. And thank fuck for the BBC. The radio app has been a godsend. It's like I've never left. Although waking up to You & Yours has been an interesting experience; sort of like the opposite of waking up to Up All Night when I'm in Oman or India. And with free VPN apps, I've even been able to tune into Sky Sports to watch live events, while catching up on the latest comedies via the iPlayer app!

    It was also interesting to vote in the Labour leadership election while sat at my computer here in America. I didn't give Jeremy Corbyn any of my nominations and instead plumped for Kendall, Cooper and Burnham in that order. None of my choices did well in the deputy leadership and London mayoral candidate elections either. But nothing was as amusing as the media and shadow cabinet meltdown that greeted Corbyn's victory declaration. The Tories and right-wing media predictably labelled him a threat to humanity. And Blairites clearly didn't know what to do; cross the floor, jump ship or piss from inside the tent. Basically a raft of similar options that will not change the result in 2020.

    And poor Corbyn, the chap clearly wants politics to change, but I don't think he feels comfortable leading the circus. Leadership necessitates compromise, and if there is something that sets Corbyn apart, it is his principled consistency. Love or loathe him, he has made a career out of it. The leadership will be a test of his political ambition and nous, neither of which he has displayed until now. Yet there are many attributes that I admire in Corbyn (the backbencher), and it is refreshing that someone of his disposition has risen to the top of British politics.

    Alas, one wonders whether Labour should reduce itself to simply a party of protest, or seek to position itself as a government-in-waiting, ready to take over from the Tories at a general election. It's one thing to secure a thumping mandate from the cheerleading squad, quite another appealing to a wider electorate.

    I wasn't even bothered about his appearance at PMQs or at St Paul's Cathedral, his insistence on remaining silent during the national anthem, or the chaotic manner in which the shadow cabinet was formed. It reflected a person for whom substance matters over spin. But I can understand why the electorate may have viewed it differently. You know, the same people whose votes he needs in 2020. Corbyn faces an uphill battle. The Tories plan to reduce the number of MPs and re-draw constituency boundaries - largely to their benefit. And there's still no sign that Scotland will abandon the SNP. That leaves about 50-75 marginals to gain from the Tories in order to form a government.

    Corbynistas are banking on three things: the 35% that didn't vote in May, old Labour UKIP voters and old Labour Green/Lib Dem voters. It's true that a lot of people didn't vote in May, and Corbyn's election may inspire some to vote for the first time, or again. On the other hand, Labour voters who really don't like Corbyn's policies, but voted for the party earlier this year, may jump ship too. It also remains to be seen whether young voters stick with Corbyn if he continues to compromise on his principles (e.g. accepting a role as a privy counsellor). As for old Labour UKIP voters, UKIP finished second in many Labour-held seats, so there wouldn't be much point in those voters returning to Labour there. Labour needs UKIP voters in Tory-held seats to 'return to the fold'. It's a big ask. One suspects such UKIP voters would not have been impressed with Corbyn's refusal to sing the national anthem at an event commemorating the Battle of Britain. And as for old Labour Green/Lib Dem voters, well, Labour may gain a dozen seats or so that way, but what use will that be? They need at least 50. I just cannot see Corbyn winning a general election.

    It may all be a moot point. Several pundits have chipped in with their predictions of how long they think Corbyn will last, ranging from a few days to three years. Even members of his shadow cabinet refuse to say with any conviction that he will fight the next general election. For what it's worth, my guess is between six months and a year. Once the novelty wears off, once conference season ends, once there are a few more media "gaffes", and once the opinion polls tank, we'll see whether he roughs it out. Unlike power-hungry careerists who would refuse to fall on their sword until the last possible moment, I think Mr Corbyn would recognise his role in a sinking ship and jump.

    One of Corbyn's illustrious predecessors is often quoted (though perhaps incorrectly) as saying that a week is a long time in politics. Well, what a week it has been. To those who complained that politics had become a sterile affair: you've got your comeuppance. Now then, are you prepared for the consequences? I'll be watching from afar with interest.
    mcgillianaire: (Did You Know?)
    [SOURCE]

    "The world’s first parking meter, known as Park-O-Meter No. 1, is installed on the southeast corner of what was then First Street and Robinson Avenue in Oklahoma City, Oklahoma on this day in 1935.

    The parking meter was the brainchild of a man named Carl C. Magee, who moved to Oklahoma City from New Mexico in 1927. Magee had a colorful past: As a reporter for an Albuquerque newspaper, he had played a pivotal role in uncovering the so-called Teapot Dome Scandal (named for the Teapot Dome oil field in Wyoming), in which Albert B. Fall, then-secretary of the interior, was convicted of renting government lands to oil companies in return for personal loans and gifts. He also wrote a series of articles exposing corruption in the New Mexico court system, and was tried and acquitted of manslaughter after he shot at one of the judges targeted in the series during an altercation at a Las Vegas hotel.

    By the time Magee came to Oklahoma City to start a newspaper, the Oklahoma News, his new hometown shared a common problem with many of America’s urban areas–a lack of sufficient parking space for the rapidly increasing number of automobiles crowding into the downtown business district each day. Asked to find a solution to the problem, Magee came up with the Park-o-Meter. The first working model went on public display in early May 1935, inspiring immediate debate over the pros and cons of coin-regulated parking. Indignant opponents of the meters considered paying for parking un-American, as it forced drivers to pay what amounted to a tax on their cars, depriving them of their money without due process of law.

    Despite such opposition, the first meters were installed by the Dual Parking Meter Company beginning in July 1935; they cost a nickel an hour, and were placed at 20-foot intervals along the curb that corresponded to spaces painted on the pavement. Magee’s invention caught on quickly: Retailers loved the meters, as they encouraged a quick turnover of cars–and potential customers–and drivers were forced to accept them as a practical necessity for regulating parking. By the early 1940s, there were more than 140,000 parking meters operating in the United States. Today, Park-O-Meter No. 1 is on display in the Statehood Gallery of the Oklahoma Historical Society."
    mcgillianaire: (South Park Me)
    [SOURCE]

    "The critically acclaimed 2002 biopic Walk The Line depicts the life and career of Johnny Cash from his initial rise to stardom in the 1950s to his resurgence following a drug-fueled decline in the 1960s. The selection of this time span made perfect sense from a Hollywood perspective, but from a historical perspective, it left out more than half of the story. There was still another dramatic resurgence to come in the second half of Johnny Cash’s 50-year career, which reached another low point on this day in 1986, when Columbia Records dropped him from its roster after 26 years of history-making partnership.

    Columbia first signed Johnny Cash in 1960, using a lucrative contract to lure him away from Sun Records, his first label and also the early home of Elvis Presley, Jerry Lee Lewis and Carl Perkins. Cash’s first Columbia single, “All Over Again,” made the country Top 5, and his second, “Don’t Take Your Guns To Town” made it all the way to #1, while also crossing over to the pop Top 40. But the biggest hits of Cash’s career were yet to come, including an incredible eight #1 albums in an eight-year span: Ring of Fire: The Best of Johnny Cash (1963); I Walk The Line (1964); Johnny Cash’s Greatest Hits (1967); At Folsom Prison (1968); At San Quentin (1969); Hello, I’m Johnny Cash (1970); The Johnny Cash Show (1970); and Man In Black (1971). During this period, Johnny Cash established himself as a titanic figure in American popular culture while selling millions upon millions of records for Columbia, but by the mid-1980s, fashions in country music had shifted dramatically away from his old-school style, and the hits simply stopped coming.

    In 1986, having also recently dropped jazz legend Miles Davis from its roster of artists, Columbia chose to end its no-longer-profitable relationship with Johnny Cash. Cash did not remain professionally adrift for long, however, releasing four original albums and numerous re-recordings of earlier material over the next seven years on Mercury Records. But it was not until 1994 that Cash truly found his creative bearings again. That was the year that he released the album American Recordings, the first in a series of albums on the label of the same name headed by Rick Rubin, the original producer of the Beastie Boys and the co-founder, with Russell Simmons, of Def Jam Records.

    Under Rubin’s influence, Cash moved to a raw, stripped-down sound that proved to be enormously successful with critics, with country traditionalists and with hipster newcomers to country music. When his second Rubin-produced album, Unchained, won a Grammy for Best Country Album in 1998, American Recordings placed a full-page ad in Billboard magazine featuring a 1970 photo of Cash brandishing his middle finger under the sarcastic line of copy, “American Recordings and Johnny Cash would like to acknowledge the Nashville music establishment and country radio for your support.”

    Johnny Cash went on to have two more massively successful solo albums with American Recordings prior to his death in 2003. Rick Rubin went on to become co-head of Columbia Records in 2007."
    mcgillianaire: (Bedouin in Desert)
    [SOURCE]

    "On July 13, 1985, at Wembley Stadium in London, Prince Charles and Princess Diana officially open Live Aid, a worldwide rock concert organized to raise money for the relief of famine-stricken Africans. Continued at JFK Stadium in Philadelphia and at other arenas around the world, the 16-hour “superconcert” was globally linked by satellite to more than a billion viewers in 110 nations. In a triumph of technology and good will, the event raised more than $125 million in famine relief for Africa.

    Live Aid was the brainchild of Bob Geldof, the singer of an Irish rock group called the Boomtown Rats. In 1984, Geldof traveled to Ethiopia after hearing news reports of a horrific famine that had killed hundreds of thousands of Ethiopians and threatened to kill millions more. After returning to London, he called Britain’s and Ireland’s top pop artists together to record a single to benefit Ethiopian famine relief. “Do They Know It’s Christmas?” was written by Geldof and Ultravox singer Midge Ure and performed by “Band Aid,” an ensemble that featured Culture Club, Duran Duran, Phil Collins, U2, Wham!, and others. It was the best-selling single in Britain to that date and raised more than $10 million.

    “Do They Know It’s Christmas?” was also a No. 1 hit in the United States and inspired U.S. pop artists to come together and perform “We Are the World,” a song written by Michael Jackson and Lionel Ritchie. “USA for Africa,” as the U.S. ensemble was known, featured Jackson, Ritchie, Geldof, Harry Belafonte, Bob Dylan, Cyndi Lauper, Paul Simon, Bruce Springsteen, Tina Turner, Stevie Wonder, and many others. The single went to the top of the charts and eventually raised $44 million.

    With the crisis continuing in Ethiopia, and the neighboring Sudan also stricken with famine, Geldof proposed Live Aid, an ambitious global charity concert aimed at raising more funds and increasing awareness of the plight of many Africans. Organized in just 10 weeks, Live Aid was staged on Saturday, July 13, 1985. More than 75 acts performed, including Elton John, Madonna, Santana, Run DMC, Sade, Sting, Bryan Adams, the Beach Boys, Mick Jagger, David Bowie, Queen, Duran Duran, U2, the Who, Tom Petty, Neil Young, and Eric Clapton. The majority of these artists performed at either Wembley Stadium in London, where a crowd of 70,000 turned out, or at Philadelphia’s JFK Stadium, where 100,000 watched. Thirteen satellites beamed a live television broadcast of the event to more than one billion viewers in 110 countries. More than 40 of these nations held telethons for African famine relief during the broadcast.

    A memorable moment of the concert was Phil Collins’ performance in Philadelphia after flying by Concorde from London, where he had performed at Wembley earlier in the day. He later played drums in a reunion of the surviving members of Led Zeppelin. Beatle Paul McCartney and the Who’s Pete Townshend held Bob Geldof aloft on their shoulders during the London finale, which featured a collective performance of “Do They Know It’s Christmas?” Six hours later, the U.S. concert ended with “We Are the World.”

    Live Aid eventually raised $127 million in famine relief for African nations, and the publicity it generated encouraged Western nations to make available enough surplus grain to end the immediate hunger crisis in Africa. Geldof was later knighted by Queen Elizabeth II for his efforts.

    In early July 2005, Geldof staged a series of “Live 8” concerts in 11 countries around the world to help raise awareness of global poverty. Organizers, led by Geldof, purposely scheduled the concert days before the annual G8 summit in an effort to increase political pressure on G8 nations to address issues facing the extremely poor around the world. Live 8 claims that an estimated 3 billion people watched 1,000 musicians perform in 11 shows, which were broadcast on 182 television networks and by 2,000 radio stations. Unlike Live Aid, Live 8 was intentionally not billed as a fundraiser–Geldof’s slogan was, “We don’t want your money, we want your voice.” Perhaps in part because of the spotlight brought to such issues by Live 8, the G8 subsequently voted to cancel the debt of 18 of the world’s poorest nations, make AIDS drugs more accessible, and double levels of annual aid to Africa, to $50 billion by 2010.”
    mcgillianaire: (Scale of Justice)
    [SOURCE]

    "In Furman v. Georgia, the U.S. Supreme Court rules by a vote of 5-4 that capital punishment, as it is currently employed on the state and federal level, is unconstitutional. The majority held that, in violation of the Eighth Amendment to the Constitution, the death penalty qualified as “cruel and unusual punishment,” primarily because states employed execution in “arbitrary and capricious ways,” especially in regard to race. It was the first time that the nation’s highest court had ruled against capital punishment. However, because the Supreme Court suggested new legislation that could make death sentences constitutional again, such as the development of standardized guidelines for juries that decide sentences, it was not an outright victory for opponents of the death penalty.

    In 1976, with 66 percent of Americans still supporting capital punishment, the Supreme Court acknowledged progress made in jury guidelines and reinstated the death penalty under a “model of guided discretion.” In 1977, Gary Gilmore, a career criminal who had murdered an elderly couple because they would not lend him their car, was the first person to be executed since the end of the ban. Defiantly facing a firing squad in Utah, Gilmore’s last words to his executioners before they shot him through the heart were, “Let’s do it.”"
    mcgillianaire: (South Park Me)
    [SOURCE]

    "On this day in 1905, some 450 people attend the opening day of the world’s first nickelodeon, located in Pittsburgh, Pennsylvania, and developed by the showman Harry Davis. The storefront theater boasted 96 seats and charged each patron five cents. Nickelodeons (named for a combination of the admission cost and the Greek word for “theater”) soon spread across the country. Their usual offerings included live vaudeville acts as well as short films. By 1907, some 2 million Americans had visited a nickelodeon, and the storefront theaters remained the main outlet for films until they were replaced around 1910 by large modern theaters.

    Inventors in Europe and the United States, including Thomas Edison, had been developing movie cameras since the late 1880s. Early films could only be viewed as peep shows, but by the late 1890s movies could be projected onto a screen. Audiences were beginning to attend public demonstrations, and several movie “factories” (as the earliest production studios were called) were formed. In 1896, the Edison Company inaugurated the era of commercial movies, showing a collection of moving images as a minor act in a vaudeville show that also included live performers, among whom were a Russian clown, an “eccentric dancer” and a “gymnastic comedian.” The film, shown at Koster and Bial’s Music Hall in New York City, featured images of dancers, ocean waves and gondolas.

    Short films, usually less than a minute long, became a regular part of vaudeville shows at the turn of the century as “chasers” to clear out the audience after a show. A vaudeville performers’ strike in 1901, however, left theaters scrambling for acts, and movies became the main event. In the earliest years, vaudeville theater owners had to purchase films from factories via mail order, rather than renting them, which made it expensive to change shows frequently. Starting in 1902, Henry Miles of San Francisco began renting films to theaters, forming the basis of today’s distribution system. The first theater devoted solely to films, The Electric Theater in Los Angeles, opened in 1902. Housed in a tent, the theater’s first screening included a short called New York in a Blizzard. Admission cost about 10 cents for a one-hour show. Nickelodeons developed soon after, offering both movies and live acts."
    mcgillianaire: (India Flag)


    It's not often an English pop song is a copy of a Tamil film song, but one example is American hip-hop artist will.i.am's "It's My Birthday", a UK number one hit single last year. It's surprising how this song escaped my notice, but it's always a pleasure to make such discoveries. Wikipedia confirms the connection between the two songs. Indeed there is a reference to the Tamil original in the opening lines of the English song. To come across this while listening to piano renditions of English pop songs on Spotify was especially gratifying, because I had just wondered whether Spotify also stored piano renditions of Tamil and Bollywood numbers. I still don't know the answer to that question, but you could be fooled into thinking there was at least one in the database.
    mcgillianaire: (South Park Me)


    "Lottery commercials are incredibly seductive and they're also everywhere. States spend half a billion on them every year and the reason they do that, is [that] the lottery is a massive moneymaker for them. Last year alone, lottery sales totalled about $68 billion. That's more than Americans spent last year on movie tickets, music, porn, the NFL, Major League Baseball, and video games, combined. Which means Americans spend more on the lottery than they spent on America." (Gorra love John Oliver.)
    mcgillianaire: (Union Jack)


    Keen listeners of this delightful programme would not have been surprised by the contents of the recorded conversation between Thatcher and Reagan from 1983 that has just been released. The Radio 4 programme was broadcast in August last year, and we learnt from it, via the Downing Street note of the conversation, that Reagan initially tried to defuse the situation by suggesting that if he were in London, he would first throw his hat into the room before walking in. We also learnt that Reagan used the phrase 'zero hour', saying the operation had reached that point before he could do anything about it. Exactly as it is in the recording. If you've got 8 minutes, it's worth listening from about 3:40 through the whole section on Grenada in the UK Confidential episode. It includes a brief interview about the declassified documents with Lord Owen (a former British Foreign Secretary) and an American diplomat who was working in the US Embassy in London at the time. It is rather instructive that the American diplomat had dinner with Geoffrey Howe (the then British Foreign Secretary) the night before the invasion, and yet neither knew anything about it! It is also worth noting that the American diplomat refers to the 1983 Beirut barracks bombing, which killed nearly 300 American and French servicemen just a couple of days before the invasion, as a tragedy so severe that the invasion may have been intended partly as a diversion from it.

    On the recording, Reagan says he wanted to inform Thatcher of the invasion before some rogue informant did, but in an interview with the US President's authorised biographer on the wireless last night, this was quickly dismissed. The biographer was convinced Reagan was fibbing and had intentionally delayed informing her until it was too late (by about 8 hours). However, the biographer also added that Thatcher was somewhat embarrassed on two counts. One was not having responded to the situation in Grenada, despite having been requested (along with the French) to do so by its government; the other was finding herself in a similar position to Reagan's after Britain's own invasion of the Falklands a year earlier. Yet despite these two foreign policy setbacks, they still seemed to share a politically intimate relationship. A point driven home by the biographer's final anecdote about a poster* Reagan kept in his stable, recreating the famous Rhett Butler-Vivien Leigh pose from Gone With The Wind with the two of them on it instead. The biographer asked if he had shown it to Thatcher, to which Reagan said no way, she'd get upset. The biographer apparently told him that, on the contrary, he thought she'd rather like it, mischievously adding that it was probably her ultimate fantasy...

    I also found it interesting that the biographer seemed to suggest that the Americans were justified in their actions on the pretext of protecting the 500 or so American students on the island. In contrast, Lord Owen suggests that the students didn't seem worried at all, lending credence to alternative theories. Either way, the release of the recording has thrown further light onto an important episode in the history of Anglo-American relations. One just wonders what else will be released to us in the days, weeks, months, years... even decades to come.

    (* I don't think the picture above is the exact poster. This seems to be some anti-war poster from the 1980s, but I suspect it looked something like this.)

    mcgillianaire: (Did You Know?)
    Maverick:
    1867, "calf or yearling found without an owner's brand," so called for Samuel A. Maverick (1803-1870), Texas cattle owner who was negligent in branding his calves. Sense of "individualist, unconventional person" is first recorded 1886, via notion of "masterless."

    Gobbledygook:
    also gobbledegook, "the overinvolved, pompous talk of officialdom" [Klein], 1944, American English, first used by U.S. Representative Maury Maverick, Democrat-Texas, (1895-1954), a grandson of the original maverick and chairman of U.S. Smaller War Plants Corporation during World War II. First used in a memo dated March 30, 1944, banning "gobbledygook language" and mock-threatening, "anyone using the words activation or implementation will be shot." Maverick said he made up the word in imitation of turkey noise. Another word for it, coined about the same time, was bafflegab (1952).

    SOURCE: Online Etymology Dictionary
    mcgillianaire: (Union Jack)


    I consider this speech one of the greatest ever and I remember the goosebumps I felt when I heard it for the first time ten-and-a-half years ago. Even after several dozen viewings a decade later, it doesn't fail to induce the same feelings. As the West prepares to attack Syria in the coming days, it's worth reminding ourselves of the arguments against military intervention without international agreement or domestic support. Robin Cook's passing was a great loss to British politics.

    EDIT @ 16.30, AUG 28:
    Does anybody recognise the other politicians, besides Cook and Corbyn, in that still-image of the video? I feel like I should know the names of the chap sitting immediately to Cook's right, and the chap sitting immediately behind him to his left (with hands crossed), but I haven't been able to figure it out in ten years, leaving me with little chance to figure out the others either. The chap on the top left of the screen reminds me of Richard Griffiths.

    Josh (v.)

    Aug. 24th, 2013 02:00 pm
    mcgillianaire: (Did You Know?)
    "to make fun of, to banter," 1845, American English, probably from the familiar version of the proper name Joshua, but just which Joshua, or why, is long forgotten. Perhaps it was taken as a typical name of an old farmer. The word was in use earlier than the career of U.S. humorist Josh Billings, pseudonym of Henry Wheeler Shaw (1818-1885), who did not begin to write and lecture until 1860; but his popularity after 1869 may have influence that of the word.
      About the most originality that any writer can hope to achieve honestly is to steal with good judgment. ["Josh Billings"]
    Related: Joshed; joshing.

    SOURCE: Online Etymology Dictionary
    mcgillianaire: (Default)
    The US is trying to force India to make a choice: support the UNSC-sponsored report against Iran or lose the nuclear energy deal signed in July.

    A lot of different issues make the decision a complicated one for the bigwigs in New Delhi. One that surprised me, however, was the fact that aside from housing the world's 2nd largest Muslim population (after Indonesia), India is also home to the world's 2nd largest Shi'a population, second only to Iran. With more than 20 million Shi'as, growing domestic energy demands and only 3% of current energy requirements met by nuclear power, it will be quite interesting to see which path India takes.

    Iran's shadow over India-US relations - BBC News, September 20
    Historic breakthrough for India-US relations - BBC News, July 19
    Iran's Nuclear Program - Wikipedia
