Thursday, October 27, 2016

Enjoy a night of fear with these flicks

Carnival of Souls (1962)
Originally printed in the Batesville Daily Guard

Halloween is upon us and what better time to catch up on your horror movie watching?
Sure, you can say “I’m already watching ‘The Walking Dead,’” but that’s just a soap opera with zombies. No, what you want on Halloween is something that will make you feel uneasy, frightened or at least a little nervous … not a TV show that frustrates, angers or even bores you.

So, today I’m passing on a couple of movie suggestions for those who have a few hours to curl up in front of the TV Halloween night while the kids are out or have been put to bed. Each movie represents what I would call a different sort of fear so that the viewer gets a nice rounded experience.

“Carnival of Souls” (1962)

Ghost stories were being passed from generation to generation by humans way before Halloween was ever conceived. “Carnival of Souls” is a ghost story, but it isn’t about a haunted house or vengeance from beyond the grave. Instead it’s a movie about a protagonist, a woman who is the only survivor of a car wreck at the beginning of the film, growing more detached from, and eventually rejected by, the world and people around her. Eventually, the only ones who seem to notice her are the ghosts she sees more and more frequently.

The film never goes for outright scares. Instead, the events in the film build upon each other. The increasing isolation of the movie’s lead character from the world around her and the final confrontation with the ghosts, who appropriately gather to dance the night away at a carnival, leave the viewer with a sense of hopelessness and futility instead of fear.

“Night of the Living Dead” (1968)

If it weren’t for this movie, you wouldn’t be watching “The Walking Dead,” reading “World War Z” or playing “Resident Evil.” We’re so accustomed to cannibal zombies in media now that they’re no longer scary, but back when George Romero made “Night of the Living Dead,” they were something new.

Sure, there had been zombies in movies before, “White Zombie” jumps immediately to mind, but until “Night of the Living Dead,” they had been mostly portrayed true to their roots in Haitian folklore — a dead person brought back to life to serve a single master. “Night of the Living Dead” introduced much of the world to swarms of flesh-eating zombies that overwhelmed their human prey by sheer numbers. It also introduced that other zombie movie trope: You often have more to fear from your fellow man than from the zombies.

“Texas Chainsaw Massacre” (1974)

Along with “Night of the Living Dead” this is considered one of the movies that modernized horror. Before these two movies, horror largely followed a monster-of-the-week pattern where the monster died in the end and the (usually male) hero got out alive, if not unscathed. “Texas Chainsaw Massacre” turned that formula totally on its head. Not only was our “monster” human, the “hero” (survivor is more appropriate) is a woman.

It follows, and pretty much established, a formula we’ve grown used to. A bunch of teenagers or 20-somethings go out in the middle of nowhere and are killed one-by-one until the lone survivor fights off the killer/monster and makes it out alive and damaged. But unlike its lesser descendants, this movie goes for unsettling over disgusting. The Sawyer family, whose interior decorating was inspired by the home of Ed Gein — the real life inspiration for Norman Bates in “Psycho” and Buffalo Bill in “Silence of the Lambs” — and their twisted abode will cause several nights of restless sleep for the first-time viewer.

“The Thing” (1982)

John Carpenter is a director whose name is synonymous with Halloween. After all, he introduced the world to Michael Myers in, well, “Halloween.” “Halloween” was a very influential horror movie, but it was not Carpenter’s best. His best work would come just four years later, when he delivered what I, and many others, consider his masterpiece: “The Thing.”

“The Thing” is a violent film, no doubt about it. But the violence isn’t from slashing, bludgeoning or drilling as we are prone to see in horror movies nowadays. Instead, the violence comes from the creature being revealed, transforming its human forms into something truly horrific. The scares don’t come from a boogeyman sneaking up behind you or coming out of the dark. Instead, it hides right in front of you, within the skin of your co-worker or friend, waiting to take you when it’s only the two of you in the room. Worse yet, it can be more than one person, or animal, at a time. All through the film, even in its last scene, you’re left to wonder, “is it one of them?”

“Session 9” (2001)

By 2001, most mainstream horror movies were trending toward jump scares and way too much self-awareness thanks to movies like “Scream” and “I Know What You Did Last Summer.” “Session 9” is one of the few movies to buck that trend, largely hiring actors to play characters instead of pretty faces to play victims.

“Session 9” doesn’t start with a bang. It hardly has any bangs at all. Instead, you start out with the feeling that something is very, very wrong on what should be a regular job for a group of contractors removing asbestos from Danvers State Hospital, a real-life psychiatric hospital that once stood in Massachusetts. The movie builds on that feeling, every scene peeling away another layer, beginning with the normal and ending in the horrific.

The sense of unease is not lifted by the end of the movie; instead, you’re left to wonder who, or what, is really to blame for what unfolded. Danvers State Hospital is almost a character itself, with its decaying walls, massive empty rooms and narrow hallways. The hospital was largely demolished in 2006.

Follow Joseph on Facebook.

Thursday, October 20, 2016

A table just for two

Originally printed in the Batesville Daily Guard

If there’s something that has become pretty obvious this election, it’s that many, many Americans are not happy with their choices.


First, you have the major parties, the Republicans and Democrats, running two of the most unpopular candidates in history. According to RealClearPolitics.com, which averages several different polls, Democratic presidential nominee Hillary Clinton has an average unfavorability rating of 52.4 percent, while Republican nominee Donald Trump averages 60.8 percent unfavorable.

That isn’t a good sign for the winner of this election, as the 2018 midterm elections will probably see their party lose seats in both the House and Senate.

Sure, there are options that aren’t Democrat or Republican, the most prominent being Gary Johnson of the Libertarian Party and Jill Stein of the Green Party. The problem is that neither has a realistic shot at actually winning the presidency.

Of course, a lot of that can be blamed on our electoral system. Unlike most modern democracies, the U.S. uses the Electoral College, which means we don’t directly vote for president; instead, we vote for electors who cast votes on our behalf, and in nearly every state the candidate who wins the state takes all of its electors, whether they win it with 90 percent of the vote or with 43 percent. The winner doesn’t need a majority of the national popular vote, which is 50 percent plus one; they just need 270 electoral votes.

But the lack of success of third parties can’t be blamed solely on the Electoral College. It can also be partially blamed on the nature of the parties themselves.

You see, many of the third parties are ideologically based, like the Libertarians and Greens. That means they don’t have much room to wiggle while staying true to their base, unlike big-tent parties like the Republicans and Democrats, which are each made up of a variety of different stripes but remain flexible enough to target the all-important moderate voter in the general election.

This is hard to do for parties based on ideological principles instead of vote-winning. Expanding the tent can bring in new members but also alienate old members who demand a certain level of purity. It’s also a challenge to reach out to moderate voters, who tend to be less focused on ideology and more concerned with issues that require a flexibility many of these parties can’t provide.

And of course, the presidential debates, or in reality joint press conferences, also shut out other voices. You can thank the Commission on Presidential Debates for that.

The most successful third parties in the U.S. have had “big tent” potential, like the Bull Moose Party of the 1910s and the Reform Party of the 1990s. But those parties, which had the opportunity to grow, quickly fizzled out once the strong personalities who founded them, like Theodore Roosevelt and Ross Perot, were no longer the driving force.

Now it seems more people than ever want to support a third party. Unfortunately, this is also a polarizing election where people are willing to go the “lesser of two evils” route because they know who they don’t want in office. It’s an election where many people are voting against someone instead of for someone.

The problem, then, is not that people don’t want a choice, but that they often see a lack of viable choices. That’s not a good way to inspire faith in our democracy.


So what’s it take for a third party to get somewhere? Probably a big tent that won’t blow over after one election. But it’s also got to have resilience to fight the establishment, whether it be the two-party system, electoral college or lack of coverage from the national media.

Follow Joseph on Facebook


Monday, October 17, 2016

Who's afraid of the popular vote?

I know this is something all of us who pay more than a small amount of attention to presidential politics have heard complaints about at one time or another: the Electoral College.


For those of you who didn't pay attention in civics class or live outside the U.S., the Electoral College is essentially this: a body of people representing the states of the U.S. who formally cast the votes that elect the president and vice president.

See, in the U.S. we don't directly elect the president or the vice president. Sure, we have nationwide elections for the office, but we are actually choosing "electors,” who usually pledge to vote for a particular candidate. Each U.S. state is given one elector for each of its U.S. senators and representatives, so of course, larger states have more.

I, like many others over the years, would like to see it replaced by a popular vote for president.

Of course, if that were to happen we'd still have to decide one thing: Do we want to determine the president by whoever gets the most votes, or plurality, or whoever gets the majority? That's a very important difference.

You see, aside from the infamous elections of 2000, 1888, 1876 and 1824, where the candidate with the most votes lost, we've also had several presidential elections where the candidate who got the most votes won the presidency with less than 50 percent of the popular vote.

Counting the elections where the candidate with the most votes lost, there have been 18 elections in U.S. history (12 if you don't count the popular-vote losers who won) where the winner had less than 50 percent of the votes. This includes both of Bill Clinton's wins, with 43.01 percent in 1992 and 49.23 percent in 1996, Richard Nixon with 43.42 percent in 1968 and John Kennedy with 49.72 percent in 1960.

Now some people will point at that list and say “see, it works out!” But of course, that's when their guy wins.

That's not to say it hasn't worked out in some instances. After all, I'm not writing about the “quality” of leadership, though the Electoral College has a very mixed record on that — remember the George W. Bush presidency, after all.

This year, it's very likely that the winner will win the popular vote but won't hit 50 percent plus one. I'm sure some people will be pleased with that and many of them are the same people who cursed the electoral college in 2000 when Bush beat Al Gore.

The most common argument I hear for the Electoral College is “well, if we didn't have it, they'd only go to the big-population cities and states.” That's nonsense though. Less than 10 percent of Americans live within the boundaries of the country's 10 largest cities and only 35 percent live in the 10 largest metro areas, which include suburbs and exurbs. These cities and metros are also not concentrated in one region or state and are not necessarily “liberal” or “conservative” in their voting patterns.

Of course, one can argue that the bigger states would dominate candidates' interest. After all, why bother campaigning for a few hundred thousand votes in Wyoming or Vermont when you can be campaigning for millions in California? The 10 most populous states — California, Texas, Florida, New York, Pennsylvania, Illinois, Michigan, Ohio, North Carolina and Georgia — have 50 percent of the U.S. population.

But so what?

Those states already have much more influence than smaller states. California has 53 members in the U.S. House of Representatives while Vermont and Wyoming have one each. Their electoral votes reflect this: California has 55 electoral votes while Wyoming and Vermont each have three. So the small states are still minor players; nothing is going to balance that out.
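For the arithmetic behind those numbers, here is a minimal sketch (using only the three states named above) of how electoral votes are allocated: every state gets its House seats plus its two senators.

```python
# Electoral votes = U.S. House seats + 2 senators, using the seat counts cited above.
house_seats = {"California": 53, "Vermont": 1, "Wyoming": 1}

for state, seats in house_seats.items():
    electoral_votes = seats + 2  # every state gets two senators regardless of size
    print(f"{state}: {electoral_votes} electoral votes")
# Prints California: 55, Vermont: 3, Wyoming: 3 -- matching the figures in the paragraph above.
```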
What the electoral system has done is essentially carve the states into a lot of “this is our state, no need to put much effort in here” and a few “we have to fight for these states.” It has essentially turned the country into three voting blocs, Blue states that Republicans don't bother with, Red states that Democrats don't bother with and Battleground states where almost all their effort goes. If you're not in a battleground state, you're pretty much ignored and often your vote doesn't matter because the people of your state are going to vote overwhelmingly for one candidate over the other. If your vote is going to be tossed, why even bother voting?

If the Electoral College were done away with, then those Red voters in Blue states and Blue voters in Red states would see their votes actually matter in a presidential race. It wouldn't be “Candidate X won Arkansas, all the Arkansas votes for Candidate Y can be tossed.” Can anybody justify doing that to people who turn out in the rain, cold and wind to wait in lines going around the block to get to the voting booth?

Which leads us to “third party” supporters, as they are called.

Those are people who are told “you're throwing away your vote” when they vote their conscience. It's not very fair to them nor is it to their candidates who are labeled “spoilers” by some and “nobodies” by others. If a candidate is able to get on enough ballots to win, don't they deserve a level playing field too? Should their supporters' votes go in the trash as well?

That's why I believe we should not only ditch the Electoral College, but also require some form of runoff if no candidate garners 50 percent plus one of the popular vote.

There are two ways to handle this: A runoff election or instant runoff voting.

Instant runoff voting is simple to explain, but it's the more complicated of the two to carry out. Essentially, when you vote, you get a list of all the candidates, each with a box by their name. Using those boxes, you rank your favorite candidate with a “1,” your preferred alternative with a “2,” and the other candidates with a “3” or “4” and so on, depending on how many candidates you are willing to vote for. If nobody wins a majority of first choices, the last-place candidate is eliminated and those ballots count for their next choice, over and over until someone has a majority. Less complicated ballots list the candidates twice, with one list for your first preference and the other for your “alternate preference.”

You see, simple sounding but, at the same time, pretty complicated to pull off — though you'd at least know the winner right away.
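For the curious, here's a minimal sketch of how that counting works. The ballots and candidate names are made up for illustration, and real election law adds tie-breaking and other rules this leaves out.

```python
# Minimal instant-runoff sketch: each ballot lists candidates in order of preference.
from collections import Counter

def instant_runoff(ballots):
    candidates = {name for ballot in ballots for name in ballot}
    while True:
        # Count each ballot for its highest-ranked candidate still in the race.
        tallies = Counter()
        for ballot in ballots:
            for choice in ballot:
                if choice in candidates:
                    tallies[choice] += 1
                    break
        total = sum(tallies.values())
        leader, leader_votes = tallies.most_common(1)[0]
        if leader_votes * 2 > total:  # a strict majority of the ballots still counting
            return leader
        # No majority yet: drop the last-place candidate and recount.
        candidates.remove(min(tallies, key=tallies.get))

# Hypothetical ballots: the first name is the "1" choice, the second the "2" choice, and so on.
ballots = [["C", "A"], ["C", "B"], ["A", "B"], ["A", "C"], ["B", "A"], ["B", "A"], ["B"]]
print(instant_runoff(ballots))  # "B" wins once "C" is eliminated and those ballots transfer
```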

The other option, a second round of elections, is also easy to explain and probably easier to put in place.

A second round of elections means that the two candidates with the most votes go head to head in a one-on-one election after the general election. This means that the other candidates who didn't get enough votes to be in the top two are out of the runoff round.
While they may be out as candidates, the losers are also given a great amount of power, considering the people who voted for them may be the deciding factor in who ultimately wins. The top two would have to win over those voters, which means the top-two hopefuls may need to seek the endorsements of their former rivals to win.

How much longer would a second round draw out an election? There's nothing set in stone. France held its last run-off election two weeks after the regular election in 2002 while Peru had its second round presidential election 57 days after the general election in 2016. I would hope the U.S. would stay toward the shorter end, allowing enough time for one last debate between the candidates.

In the U.S., Donald Trump and Hillary Clinton would be vying not only for voters who probably didn't show up for the first round, but also for people who voted for Gary Johnson of the Libertarian Party and Jill Stein of the Green Party. According to FiveThirtyEight.com's polls-plus forecast, Clinton will have 48.6 percent of the popular vote, Trump 43.7, Johnson 6.2 and other candidates, including Stein, will have 1.5 percent.
That means Clinton would only need 1.4 percent plus one vote to win a hypothetical runoff election. In order to win, Trump would have to not only win over all of Johnson's supporters, but also 0.1 percent of those who voted for other candidates, plus one more vote, which would be very difficult to achieve, especially if many of Johnson's supporters decided to stay home.
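To put that back-of-the-envelope arithmetic in one place, here is a quick check using the polls-plus shares quoted above; the variable names are just for illustration.

```python
# Runoff math using the FiveThirtyEight polls-plus shares quoted above (in percent).
clinton, trump, johnson, other = 48.6, 43.7, 6.2, 1.5
majority = 50.0  # "50 percent plus one" expressed as a share of the vote

print(round(majority - clinton, 1))          # 1.4  -> extra share Clinton needs, plus one vote
print(round(trump + johnson, 1))             # 49.9 -> Trump even with every Johnson voter
print(round(majority - trump - johnson, 1))  # 0.1  -> slice of "other" voters Trump still needs
```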

Of course, people would still be unhappy with that result, or any result, but that's just a regular part of democracy.

So someone may still ask “why go to a majority popular vote if the results are largely the same?”

Why? Because it not only opens the door a little bit more for those parties that aren't the Big D or Big R, but also gives them power even if they lose. It's closer to the American ideal we hold in the modern age, where everyone gets a fair shake and an equal voice. Instead of being told “you're throwing your vote away” and “don't hand the election to ______,” those voters are going to be told “we need your vote” and asked “what do we need to do for your vote?”
That, my friends, is what presidential elections are supposed to be about.

Thursday, September 22, 2016

Star Trek's legacy built on enduring vision

“Star Trek” turned 50 earlier this month, an event that was heralded with enthusiasm among fans and nerds (often one and the same) alike.

Of course, I’m speaking specifically of the original series, often referred to as TOS by fans, which premiered on NBC on Sept. 8, 1966 — a Thursday, according to my smartphone.

Considering it launched 11 years before I was born, I really don’t know how it was received at the time of its initial airing. Even if I had been alive, I doubt I would have been able to watch it. I grew up near Hardy, after all, and the only channels available were KAIT, an ABC affiliate, and AETN, a PBS affiliate.

I do know the show had made quite an impression on a lot of people at the time of its initial airing and syndicated run through the 1970s.

Me, I can't really call myself a super fan of the show. No matter which series you speak of, whether it be the original one or any of its follow-ups, there were lots of terrible episodes. Look at the first season of “Star Trek: The Next Generation.” If one were to watch the full first season, which I don't recommend, a person would scratch their head wondering “how on earth did it get a second?” I think anyone who even calls themselves a super fan will tell you what the bad episodes are … then recommend you watch them anyway.

Another thing that always bothered me about “Star Trek” in pretty much all of its incarnations, but especially the original, was the sheer number of supposed aliens that were identical to humans. Of course, in the 1960s TV budgets were limited and makeup was still underdeveloped as a craft. Other series that dealt with humans traveling to (actually, more like getting stuck on) alien worlds, like “Land of the Giants” and “Lost in Space,” were dominated by rather human-looking aliens too. So I can understand the why, but it is still one of those things that bothers me when I watch “Star Trek” and other old-school sci-fi shows.

Those other two shows I mentioned also saw several years of syndication after their initial runs too. Even when we first got cable TV at my house when I was a kid, it seemed like “Lost in Space” was on somewhere at any hour of the day.

But unlike “Star Trek,” those other shows didn’t endure in the public’s imagination. While “Star Trek” launched not one, but two, movie franchises and many different TV shows, its contemporaries faded away, living on in the land of digital sub channels.

So why did “Star Trek” endure?

Well, probably first and foremost is because it really connected with its fans. Or maybe a better word would be that it made an impression. An impression that would stick.

How did it do that? How about with its vision of the future. Instead of a largely white male cast with one woman included as eye candy, which was the norm for most science fiction until then, “Star Trek” had a crew of men and women of several races, holding a variety of positions and ranks that had nothing to do with either trait. That was pretty revolutionary to some people.

Something like that can make an impression on people. Just ask actress Whoopi Goldberg, who said “When I was 9 years old, ‘Star Trek’ came on. I looked at it and went screaming through the house, ‘Come here, Mum, everybody, come quick, come quick, there’s a black lady on TV, and she ain’t no maid!’ I knew right then and there I could be anything I wanted to be.”

Goldberg would go on to play a recurring character named Guinan, who was a bartender, and sometimes counselor, on “Star Trek: The Next Generation.”

It also made an impression in other ways, like how we perceive technology.

Want proof? Check out your cellphone.

Martin Cooper, who led the Motorola team that developed the first handheld mobile phone, was not exactly a kid when “Star Trek” first aired (he was already in his late 30s in 1966), but he has said that watching Captain Kirk use his communicator on the show inspired him to develop a handheld mobile phone. The mobile phone eventually evolved into the smartphone, which is likely years beyond those initial Star Trek communicators, save the teleportation part.

Other technology that showed up in “Star Trek” first: Tablets, smart watches, Bluetooth devices … just to name a few. Some other technologies featured on the show, like the food replicator, are on the verge of becoming reality by combining 3-D printing with knowledge of amino acids and other building blocks of life. The USS Enterprise of the original series was launched in 2245, but by the time the real 2245 gets here, its technology might be considered archaic.

But what is probably “Star Trek”’s most enduring legacy is that it helped make science fiction respectable. Sure, it still took time for the movies to catch up — studios still didn’t have faith that science fiction could be box office gold until Star Wars beat all expectations — but we can say “Star Trek” helped pave the way. It’s partially thanks to Star Trek that science fiction can be taken seriously now.

With “Star Trek: Discovery” premiering next year, we can't really expect it to take us where no one has gone before, but we can look forward to the ride.

Like Joseph on Facebook.

Tuesday, September 13, 2016

Happy belated birthday WWW

As many of you have probably heard, the World Wide Web turned 25 in August. For something so young, it feels like it’s been around forever, and it’s hard to picture society functioning without it.
It all started at CERN, the European Organization for Nuclear Research, which currently operates the largest particle physics laboratory in the world.
It wasn’t a very exciting beginning. The few visitors outside of the government, higher education and private tech spheres in August 1991 would be greeted by black text on a white background which stated “The WorldWideWeb (W3) is a wide-area hypermedia information retrieval initiative aiming to give universal access to a large universe of documents.”
There were several hyperlinks and a lot of text. But unlike what came before, it was easily navigable by people who weren’t technologically inclined.
What was the Internet like before the Web? Well, not very exciting. Several of the building blocks we take for granted now were already in existence. Local area networks and wide area networks had existed since the 1950s. People had been capable of sharing files since the 1970s. The first online bulletin board came into existence in 1978. But these systems were not easily navigable, and without the WWW there wasn’t the breadth of connectivity we see now.
The creators of the World Wide Web built on the concept of hypertext, with the goal of using it to facilitate the sharing of information among researchers.
CERN announced that the World Wide Web would be free to anyone on April 30, 1993. This is what truly opened the floodgates. You can actually see a copy of the original first webpage at http://info.cern.ch/hypertext/WWW/TheProject.html.
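If you'd rather pull that page up from a script than from a browser, here is a minimal sketch using only Python's standard library; it assumes CERN still serves the page at the address above.

```python
# Fetch CERN's copy of the first webpage and show the start of its text.
from urllib.request import urlopen

URL = "http://info.cern.ch/hypertext/WWW/TheProject.html"
with urlopen(URL) as response:
    html = response.read().decode("utf-8", errors="replace")
print(html[:300])  # the opening of the original hypertext document
```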
Those first visitors who all of a sudden had a gate to the rest of the world open up to them via their computers might not have realized they were also the first to see what would become one of the building blocks of our 2016 world, instant communication between anyone, anywhere, anytime.
Although it would be a few years before the World Wide Web really exploded into the public consciousness, sometime around the mid-’90s, Hollywood did not waste any time jumping on it with movies like “The Lawnmower Man,” “The Net” and “Hackers” — all of which wound up being way off the mark. Hollywood led us to believe that the ’net would artificially increase intelligence, like in “The Lawnmower Man,” and that anyone with a dial-up connection could find their way to top-secret government information if they were persistent enough.
But the reality was a bit different. Neither Hollywood nor the entertainment business in general really seemed to think about the impact the internet would have on them, or on how we conduct business worldwide.
After all, in the wake of the Napster debacle of the early 2000s, once the music companies found out they could make a buck off downloaded music, the music store died, probably never to come back. Although they weathered it a little better, bookstores will never quite be the same, having for the most part become elaborate coffee shops that sell books on the side.
Even the way we watch movies at home changed. Sure, we’ve been able to watch movies online since the 1990s, but who really wanted to spend the whole day downloading a likely ill-gotten feature-length film that would probably have a low-quality picture and spotty playback? When dial-up was finally left largely in the dust in the mid-2000s, that’s when streaming movies finally took off. Of course, the movie rental business went under.
With the onset of smartphones and tablets within the last decade, we can connect with the rest of the world from almost anywhere, provided there’s a Wi-Fi connection or a cell tower nearby (the latter can lead to huge phone bills, though). Now we can enjoy music, movies, TV, radio or books from pretty much anywhere. We can also bank, make sure our homes are secure, and shop for groceries and other daily necessities before we even leave work for the day.
But the most important legacy of that first WWW is probably making instant worldwide communication accessible to everyone. It’s changed the way families keep in touch, expanded our social circle beyond the local physical settings and, of course, changed the way many people meet their mate.
Think of it this way. Many of those couples who met on the internet in its early days, the 1990s, now have children who are at an age where they can marry and have children of their own. After all, 22 percent of people meet their partners online now, second only behind mutual friends at 24 percent, according to eHarmony Australia’s 2015 Relationship Study. That’s a big turnaround from the time when meeting people online was considered the domain of rejects and outcasts. Now, it looks like times have changed, and those nerds and outcasts might be the ones who inherit the earth.

Originally printed in the Batesville Daily Guard
Follow Joseph on Facebook

Thursday, September 8, 2016

Buffet roulette

Originally printed in the Batesville Daily Guard
Remember those sick days you took off from school as a kid?
I sure do.
I, as well as many of my peers, would often exaggerate our discomfort to get those little breaks to stay home and do other things, usually watch TV and play video games. Those slightly elevated temperatures, red eyes or tight bellies were like the Golden Ticket out of the classroom if we brought them to the school nurse.
Of course, there was a huge difference between being “sick” and being really sick.
Really sick wasn’t much fun. Unlike the occasional cold or too-greasy/spicy food from the night before, one couldn’t simply take a Tylenol or a shot of Pepto Bismol, and feel good enough to watch TV or play video games. Really sick came hand in hand with being really miserable, often not able to get out of bed. 
I experienced “really sick” a few times in my youth. Usually, it’d be from catching one strain of the flu or another at school. The worst was when I caught pneumonia, which caused me to be bedridden for a week, lacking the energy to get up and even eat.
Getting really sick as an adult is different. Unlike for a kid in school, even having to leave work early can have serious repercussions. For many, the few hours they miss could mean being late on a bill or buying fewer groceries. For those who wind up missing days, it could cost them getting that month’s rent or car payment in on time.
Some of us are lucky enough to have understanding employers and the type of job that let us make up those hours we were absent or have adequate sick leave. For that, many of us should feel blessed. But there are still many that don’t have that luxury.
But back to me: I got really sick last week. Not only was I sick, but so was the rest of my family. We believe we caught norovirus; our symptoms fit it, anyway. Norovirus is, and here’s the disgusting part, spread by fecal-oral transmission. We believe we came into contact with it while eating at a buffet in a different city (from my experience, the buffets in Batesville have been great). I wasn’t ignorant of the fact that there was always a chance; it’s just that in the past, my gambles had always paid off in a full belly and nothing more.
As they say “you play with fire ...”
Norovirus infection can cause nausea, vomiting, watery diarrhea, abdominal pain and loss of taste. It also causes lethargy, weakness, muscle aches, headaches and low-grade fevers. Luckily it’s not usually dangerous and most people who contract it make a full recovery within two to three days.
Let’s just say it’s a very unpleasant experience.
Norovirus causes 19 million to 21 million illnesses every year. The outbreaks are usually in crowded environments like nursing homes, day care centers and cruise ships. Young children and the elderly tend to suffer the worst effects. Norovirus also causes 570 to 800 deaths each year, according to the CDC.
What’s the best way to prevent it? Sufficient heating, chlorine-based disinfectants and polyquaternary amines, and washing your hands if you’re serving food. There’s also a vaccine, developed by the Japanese, in the human testing phase.
But even with the vaccine, I’d still rather not unknowingly consume fecal matter in the first place.
In my case, things weren’t handled this way, so I got sick. Not only that, but my entire family got sick as well. It was a terrible experience, with both my wife and I feeling so weak that comforting our sick son seemed to take everything we had.
On the bright side, it’s a great way to lose weight if you’re looking to do so. In the span of 24 hours I managed to lose five pounds. Of course, as soon as my appetite returned, I started working on putting those five pounds back on.
So next time you eat out, make sure you pay attention to how your food is being handled. It may save you a few days of misery.
Follow Joseph on Facebook

Tuesday, August 23, 2016

Cultured meat is a future feast

Say hello to the future. The development of cultured meat opens the door to solving many of the world's food problems as well as environmental issues.
It might have sounded like science fiction just a few years ago, but today it seems we are just a few years away from having meat and leather that don't require an animal to be killed. These cultured meat and leather products are grown in just a few labs right now, but one day they will be mass-produced in large facilities.

This may sound unlikely to many people, especially the detractors who point at the very first lab-grown burger, which was presented to the public just three years ago. That burger was cooked and eaten at a news conference on Aug. 5, 2013, to a mixed reception. But the biggest news wasn't that the burger was grown or that it tasted like a real burger: it was that it cost $325,000 to produce.

Even now, after it was reported in 2015 that the same burger would cost $11.36 to produce, a drop of $324,988.64 in just two years, detractors still point at the obsolete $325,000 price tag.

But despite the semi-skeptical reception and the prevailing anti-biotech attitude of much of the public, lab-grown animal products are still moving forward. The latest development is the announcement that Brooklyn-based startup Modern Meadow secured $40 million in Series B funding on June 28, bringing its total funds raised to $53.5 million.

“Modern Meadow harnesses the combined power of design, biology and engineering to change the way we think about materials, unlocking the capabilities of nature. Leather, which represents a $100 billion raw material market, has always been prized for its beauty, functionality and enduring status,” according to Modern Meadow CEO and co-founder Andras Forgacs. “Today, as a co-product of the meat industry, it is subject to fluctuations in availability, quality, price and growing demand. At Modern Meadow, we’re re-imagining this millennia-old material to create revolutionary new features without harming animals or the environment.”

Millions have also been invested in research and development for other companies, like Memphis Meats, a meat-growing startup.

According to Memphis Meats CEO and Co-founder, Uma Valeti, their goal is to have the meat available for retail by 2021.

Cultured meat is expected to have a widespread impact. It's being touted as producing as little as 4 percent of the greenhouse gases produced by livestock, a positive for those concerned with environmental issues. The impact would also reach bodies of water, which would be less susceptible to runoff containing animal feces. Another big plus is that it would require only a fraction of the land needed for cattle.

One example of the impact of raising livestock can be seen among our neighbors to the south in Central and South America. Since 1960, more than a quarter of the rainforest there has been cleared for raising cattle; 70 percent of the forest in Costa Rica and Panama has been destroyed in conversion to rearing livestock, while in Brazil 40 percent of the land has been cleared for beef production, according to research biologist Brian J. Ford.

The livestock sector consumes 8 percent of all the fresh water in the world and occupies almost one-third of the world’s surface that isn't covered by ice and permafrost. It also contributes 18 percent of greenhouse gases to the atmosphere.

Then there's also the idea that it could have a significant impact on the scarcity of food in the world. We are looking at a world population that is expected to keep increasing at least through 2100. Even though worldwide birthrates are declining — the average was 5.0 births per woman in 1960 and 2.5 births per woman in 2014, according to the World Bank — the population is living longer. Women in the U.S. alone are expected to have an average lifespan of 89 to 94 years by 2050, according to the MacArthur Research Network, while men lag behind at 83 to 86 years, according to ABC News. So, even though the number of people entering the world is decreasing, the rate at which people leave it is decreasing as well.

Essentially, that means there's going to be a lot of poor people in the world to feed and lots of people see cultured meat as the way to do that.

But is it safe? Some people are concerned about it being unnatural and whether it's genetically modified.

According to New Harvest, cultured foods are unnatural in the same way that bread, cheese, yogurt and wine are. Like those foods, cultured meat involves processing ingredients derived from natural sources. They also claim that production of cultured meat is less unnatural than raising farm animals in intensive confinement systems, because intensive farming relies on synthetic hormones and artificial diets laced with antibiotics and animal wastes. Furthermore, the conventional production of meat has led to a number of health and environmental problems, including high rates of heart disease and food-borne illness, as well as soil and water pollution from farm animal wastes.

What kind of impact will it make? Likely a gradual one. After all, considering the power of industry lobbyists in the U.S., you can expect it to be tied up for years even after it's on the shelves elsewhere.

And why wouldn't the industry fight? You are looking at an industry that directly employs 482,100 workers in the U.S. with combined salaries of more than $19 billion, according to the North American Meat Institute. While that's far from the largest industry in the U.S., the people it employs would still have to find something else to do.

As the many people who have lost manufacturing jobs over the 20th century can say, technology changes things. A lot of those changes lead to at least temporary job loss. It's one of those things that comes with modernization. People, especially those who have their livelihoods tied to a given field, will resist.

But supply and demand will inevitably reign supreme and the majority of people will go for what's cheaper as long as it doesn't taste bad. That's just how the world goes.

This isn't to say you shouldn't expect animals to still be raised for food. Just expect jacked-up prices for “real meat.” Same product, different marketing strategy.

Whatever happens, it looks like cultured meat is coming and it's going to change the world as we know it.

Follow Joseph on Facebook or Twitter.


Friday, August 19, 2016

Golden rice may be Vitamin A jackpot

 July 27, 2016
Ordinary rice to the left, golden rice to the right.
Originally printed in the Batesville Daily Guard
Ever hear of Vitamin A?
Sure you have, it’s up there with Vitamins B, C, D, E and K as being essential to a healthful life.
Vitamin A plays a critical role in the maintenance of the body in regards to vision, neurological function, healthy skin, building strong bones, gene regulation, cell differentiation and immune function. It is an antioxidant, thus is involved in reducing inflammation through fighting free radical damage. A high antioxidant diet is a way to naturally slow aging.
The best sources for Vitamin A are eggs, milk, liver, carrots, yellow or orange vegetables such as squash, spinach, and other leafy green vegetables.
But in many parts of the world, many of these things are unavailable in the necessary quantities. This is especially true in areas where overpopulation and poverty are the norm. The most vulnerable people are the children of Africa and Southeast Asia.
There is one food that is widely available in these parts of the world, though: rice.
The problem though is that rice doesn’t have enough vitamin A to be effective. Naturally, anyway.
That’s where Golden Rice comes in.
Golden Rice is a genetically modified organism. Unlike regular rice, it carries beta-carotene, a major source of Vitamin A, which gives it the color for which it’s named. Like many GMOs, it contains genes that don’t originate in rice. The genes come from daffodils and a bacterium known as Erwinia. I know the word “bacteria” sounds scary to people, but remember, bacteria are just microscopic single-celled organisms. Like plants, some bacteria are beneficial to us and some are bad for us. Luckily, Golden Rice has passed safety standards and is safe for human consumption, like most GMOs on the market.
Clinical trials with adult volunteers in the U.S. concluded that “beta carotene derived from golden rice is effectively converted to vitamin A in humans,” according to the American Journal of Clinical Nutrition. The American Society for Nutrition said that “Golden Rice could probably supply 50 percent of the Recommended Dietary Allowance (RDA) of vitamin A from a very modest amount — perhaps a cup — of rice, if consumed daily.”
It sounds good, right? It’s even got the support of the Bill and Melinda Gates Foundation.
Well, instead of sounding like a way to help millions, for many, it was akin to opening Pandora’s Box.
Many anti-GMO activists, particularly Greenpeace, have made it their mission to prevent Golden Rice from being planted by farmers in Vitamin A-poor parts of the world. Aside from spreading conspiracy theories about biotechnology companies, particularly Monsanto, they also attack the plots where the rice itself is grown. In 2013, a trial plot of Golden Rice was uprooted by a gang of protesters in the Philippines who claimed that U.S. corporations were only seeking profit.
But why the resistance?
Greenpeace claims “... GE ‘Golden’ rice is a proposed but not practically viable crop solution that has never been brought to market. It is also environmentally irresponsible and could compromise food, nutrition and financial security.” Of course, they never offer any evidence to support their beliefs. Instead, we get inaccurate claims that farmers can’t “save their seeds” or “the rice will contaminate existing species.” 
Greenpeace has already been taken to task over this by 110 Nobel laureates in a letter pleading with the organization to stop the fear mongering.
Greenpeace’s response: The Nobel Prize Laureates didn’t offer “relevant expertise.”
Unfortunately for Greenpeace, nobody aside from anti-GMO activists is getting on board with them. Farmers associations in Nigeria support moving ahead with the cultivation of the rice, as does the Philippine Rice Research Institute. Anti-GMO activists accuse the groups and governments supporting Golden Rice consumption of being “bought by corporations” and have voiced support for radical groups that attack the farms where the rice is grown, destroying the crop.
So does the radical anti-GMO crowd offer an alternative solution?
“Plant sweet potatoes.”
Follow Joseph on Facebook or Twitter.

Wednesday, August 17, 2016

Whose law?

This statue of Baphomet is looking for a home and Arkansas is on the list.
Just slightly more than a week ago, Arkansas State Senator Jason Rapert announced on his Facebook page that “After several months of waiting for the American History & Heritage Foundation attorneys to finish application paperwork for the Arkansas Secretary of State, I am advised they are now submitting the paperwork to begin the process of site selection approval for the Arkansas Ten Commandments Monument!”
The announcement marks a milestone in a process that has caused a great deal of debate in Arkansas over where the separation of church and state should lie. Historically speaking, a Ten Commandments monument would be a violation of it.
Not that lawmakers don't try.
Oklahoma famously passed a law allowing privately funded religious monuments on the state capitol grounds. The kicker was that it was open to all religions, as long as they could afford to pay for their own monuments.
This backfired on them when The Satanic Temple did just that.
Of course, the Ten Commandments monument in Oklahoma came down pretty fast after lawmakers learned there was no way to stop The Satanic Temple from putting up a monument of its own on state capitol grounds. After all, the law was open to all religions.
In Arkansas, state lawmakers are trying to avoid what happened in Oklahoma by proclaiming that the Ten Commandments monument is not a religious monument. The argument that is being used is “the Ten Commandments aren't a religious document, but the historical foundation of our laws.” 
The Arkansas Ten Commandment Display Act states:
“The Ten Commandments represent a philosophy of government held by many of the founders of this nation and by many Arkansans and other Americans today, that God has ordained civil government and has delegated limited authority to civil government, that God has limited the authority of civil government, and that God has endowed people with certain unalienable rights, including life, liberty, and the pursuit of happiness;
“In order that they may understand and appreciate the basic principles of the American system of government, the people of the United States of America and of the State of Arkansas need to identify the Ten Commandments, one of many sources, as influencing the development of what has become modern law;
“The placing of a monument to the Ten Commandments on the grounds of the Arkansas State Capitol would help the people of the United States and of the State of Arkansas to know the Ten Commandments as the moral foundation of the law.”
The problem is that the Ten Commandments are mostly religious and moral rules, not the foundation of Western law. Even the Act itself sounds religious in nature, as it makes mention of God, divine endowments and morality — none of which are things most people believe the state should be involved in.
“What?” you might say. “But every elected official says they are historical!”
Well, that may be so, but the thing about elected officials is that they often tell people what they want to hear, especially if it's their base. In Arkansas, as well as most of the South, religious voters are a very big base, if not the biggest. Their support is what keeps many lawmakers in office. Things like the Ten Commandments monument make the more fundamentalist-leaning voters happy.
The problem is, most of the things on there aren't crimes, nor were they crimes in the times of the Founding Fathers of the U.S.
Of the Ten Commandments, there are only three that are actually crimes: Thou shalt not murder, steal or bear false witness. The rest are basically good advice (don't cheat on your spouse, don't be a jealous jerk) or rules pertaining to practicing the faith (no graven images, no taking God's name in vain), which almost all self-proclaimed religious folks break on a daily basis anyway.
On top of all that, of the things that are actually illegal in the Ten Commandments, all of them were illegal before Judaism and Christianity had their respective boom periods in the Mediterranean region, which is pretty much the cradle of Western civilization.
Chances are, those things were probably punished in some form even before the invention of writing. But it's one figure, Hammurabi, who made such laws famous.
Hammurabi, who lived from 1810 BC to 1750 BC, was the sixth king of the First Babylonian Dynasty, reigning from 1792 BC to 1750 BC. Written in stone, his famous code had 282 laws. Among them were rules against stealing, murder and bearing false witness. Also among them is the famous “eye for an eye” rule.
The punishment called for by Hammurabi's Code was very uneven and depended on the perpetrator's social status. A poor perpetrator would always face harsher punishment, often death, while the rich criminal often paid only a fine. 
Fortunately, no such uneven dispensation of justice exists in our society today. (For those of you that are a little dense, that's supposed to be sarcasm.)
But even Hammurabi can't be given credit for first transcribing those three laws. He was late by a couple of hundred years. The first person, as far as we know, to actually have laws against murder, stealing and false witness written down was another Mesopotamian — Ur-Nammu, founder of the Third Dynasty of Ur. Ur-Nammu is believed to have lived around 2030 BC.
Moses' birthdate, on the other hand, is believed to have been around 1400 BC, hundreds of years after Hammurabi's and Ur-Nammu's deaths. By the time Moses came along, laws against murder, stealing and bearing false witness were not only standard across the region, but across the known world as well.
Thus there's just not a valid case that the Ten Commandments are the basis of Western law. By the time that the Romans started converting to Christianity, such laws were already put in place centuries before by polytheists. 
Now, one can try to make an argument that the Ten Commandments are the “moral” basis for Western law, but even after the Christianization of the West, morality was a very flexible thing. After all, it's doubtful that Moses would approve of the Trinity, saints, crosses or the various iconography we see in Christianity today.
Other things, like coveting and honoring one's parents, have never really been addressed by Western law to any meaningful extent. Looking at the leaders of old, how many of them launched wars out of greed or dishonored their parents? A lot.
So where does that leave us?
It leaves us with a system based on man's law. Our secular law is not only supposed to protect Christians, but to hold them on equal footing with the likes of Muslims, atheists and modern pagans. If we let one or the other take control of the law, there'd be nothing for anyone else.
Sen. Rapert might want to take this into consideration. After all, he's one of the beneficiaries of secular law, which allowed his ancestors to pass their faith on to him. If the U.S. had been founded on religion, it's possible he wouldn't be following the “right” form of it.

Like Joseph on Facebook or Twitter