Thursday, September 22, 2016

Star Trek's legacy built on enduring vision

“Star Trek” turned 50 earlier this month, an event that was heralded with enthusiasm by fans and nerds (often one and the same) alike.

Of course, I’m speaking specifically of the original series, often referred to as TOS by fans, which premiered on NBC on Sept. 8, 1966 — a Thursday according to my smartphone.

Considering it launched 11 years before I was born, I really don’t know how it was received at the time of its initial airing. Even if I had been alive, I doubt I would have been able to watch it. I grew up near Hardy, after all, and the only channels available were KAIT (an ABC affiliate) and AETN (a PBS affiliate).

I do know the show made quite an impression on a lot of people during its initial airing and its syndicated run through the 1970s.

Me, I can't really call myself a super fan of the show. No matter which series you speak of, whether it be the original or any of its follow-ups, there were lots of terrible episodes. Look at the first season of “Star Trek: The Next Generation.” Anyone who watched the full first season, which I don't recommend, would scratch their head wondering, “How on earth did it get a second?” I think anyone who even calls themselves a super fan will tell you what the bad episodes are … then recommend you watch them anyway.

Another thing that always bothered me about “Star Trek” in pretty much all of its incarnations, but especially the original, was the sheer number of supposed aliens that were identical to humans. Of course, in the 1960s TV budgets were limited and prosthetic makeup was still a developing craft. Other series that dealt with humans traveling to (actually more like getting stuck on) alien worlds, like “Land of the Giants” and “Lost in Space,” were dominated by rather human-looking aliens too. So I can understand the why, but it is still one of those things that bothers me when I watch “Star Trek” and other old-school sci-fi shows.

Those other two shows I mentioned also saw several years of syndication after their initial runs. When we first got cable TV at my house when I was a kid, it seemed like “Lost in Space” was on somewhere at any hour of the day.

But unlike “Star Trek,” those other shows didn’t endure in the public’s imagination. While “Star Trek” launched not one, but two, movie franchises and many different TV shows, its contemporaries faded away, living on in the land of digital subchannels.

So why did “Star Trek” endure?

Well, first and foremost, probably because it really connected with its fans. Or maybe a better way to put it is that it made an impression, an impression that would stick.

How did it do that? How about with its vision of the future. Instead of a largely white male cast with one woman included as eye candy, which was the norm for most science fiction until then, “Star Trek” had a crew of men and women of several races holding a variety of positions and ranks that had nothing to do with either trait. That was pretty revolutionary to some people.

Something like that can make an impression on people. Just ask actress Whoopi Goldberg, who said “When I was 9 years old, ‘Star Trek’ came on. I looked at it and went screaming through the house, ‘Come here, Mum, everybody, come quick, come quick, there’s a black lady on TV, and she ain’t no maid!’ I knew right then and there I could be anything I wanted to be.”

Goldberg would go on to play a recurring character named Guinan, who was a bartender, and sometimes counselor, on “Star Trek: The Next Generation.”

It also made an impression in other ways, like how we perceive technology.

Want proof? Check out your cellphone.

Martin Cooper, who led the Motorola team that developed the first handheld mobile phone, wasn’t exactly a kid when “Star Trek” first aired (he was 37 in 1966), but he has said that watching Captain Kirk use his communicator on the show inspired him to develop a handheld mobile phone. The mobile phone eventually evolved into the smartphone, which is likely years beyond those original communicators, save for being able to call down a transporter beam.

Other technology that showed up in “Star Trek” first: tablets, smart watches, Bluetooth devices … just to name a few. Some other technologies featured on the show, like the food replicator, are on the verge of becoming reality by combining 3-D printing with knowledge of amino acids and other building blocks of life. The USS Enterprise’s mission launched in 2245, but by the time the real 2245 gets here, its technology might be considered archaic.

But probably “Star Trek”’s most enduring legacy is that it helped make science fiction respectable. Sure, it still took time for the movies to catch up — studios didn’t have faith that science fiction could be box office gold until “Star Wars” beat all expectations — but we can say “Star Trek” helped pave the way. It’s partially thanks to “Star Trek” that science fiction can be taken seriously now.

With “Star Trek: Discovery” premiering next year, we can't really expect it to take us where no one has gone before, but we can look forward to the ride.

Like Joseph on Facebook.

Tuesday, September 13, 2016

Happy belated birthday WWW

As many of you have probably heard, the World Wide Web turned 25 in August. For something so young, it feels like it’s been around forever, and it’s hard to picture society functioning without it.
It all started at CERN, the European Organization for Nuclear Research (the acronym comes from the organization’s original French name). CERN currently operates the largest particle physics laboratory in the world.
It wasn’t a very exciting beginning. The few visitors outside of the government, higher education and private tech spheres in August 1991 would be greeted by black text on a white background that stated, “The WorldWideWeb (W3) is a wide-area hypermedia information retrieval initiative aiming to give universal access to a large universe of documents.”
There were several hyperlinks and a lot of text. But unlike what came before, it was easily navigable by people who weren’t technologically inclined.
What was the Internet like before the Web? Well, not very exciting. Several of the building blocks we take for granted now were already in existence. Wide area networks dated back to the late 1950s and local area networks to the early 1970s. People had been capable of sharing files since the 1970s. The first online bulletin board came into existence in 1978. But none of it was easily navigable, and without the WWW there wasn’t the breadth of connectivity we see now.
The creators of the World Wide Web built on the concept of hypertext, with the goal of using it to facilitate the sharing of information among researchers.
CERN announced that the World Wide Web would be free to anyone on April 30, 1993. This is what truly opened the floodgates. You can actually see a copy of the original first webpage at http://info.cern.ch/hypertext/WWW/TheProject.html.
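If you’re curious what “several hyperlinks and a lot of text” looks like in practice, here’s a minimal sketch in Python (standard library only, and purely illustrative rather than anything from the original Web project) that fetches that archived first page and lists the hyperlinks it contains:
```python
from html.parser import HTMLParser
from urllib.request import urlopen

# The URL of the archived first webpage, as mentioned above.
FIRST_PAGE = "http://info.cern.ch/hypertext/WWW/TheProject.html"

class LinkCollector(HTMLParser):
    """Collects the href target of every anchor (<a>) tag on the page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

if __name__ == "__main__":
    # Fetch the page over plain HTTP and parse its anchor tags.
    html = urlopen(FIRST_PAGE).read().decode("utf-8", errors="replace")
    collector = LinkCollector()
    collector.feed(html)
    print(f"{len(collector.links)} hyperlinks found on the first webpage:")
    for link in collector.links:
        print(" ", link)
```
Run it and you get a plain list of the links Tim Berners-Lee’s page pointed to, which is really all the early Web was: documents pointing at other documents.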
Those first visitors who all of a sudden had a gate to the rest of the world open up to them via their computers might not have realized they were also the first to see what would become one of the building blocks of our 2016 world: instant communication between anyone, anywhere, anytime.
Although it would be a few years before the World Wide Web really exploded into the public consciousness, sometime around the mid-’90s, Hollywood did not waste any time jumping on it with movies like “The Lawnmower Man,” “The Net” and “Hackers” — all of which wound up being way off the mark. Hollywood led us to believe that the ’net would artificially increase intelligence, as in “The Lawnmower Man,” and that anyone with a dial-up connection could find their way to top secret government information if they were persistent enough.
But the reality was a bit different. Neither Hollywood nor the entertainment business in general really seemed to think about the impact the internet would have on them, or on how we conduct business worldwide.
After all, in the wake of the Napster debacle of the early 2000s, once the music companies figured out they could make a buck off downloads themselves, the music store died, probably never to come back. Although they weathered it a little better, bookstores will never quite be the same … being for the most part elaborate coffee shops that sell books on the side.
Even the way we watch movies at home changed. Sure, we’ve been able to watch movies online since the 1990s, but who really wanted to spend the whole day downloading a likely ill-gotten feature-length film that would probably have a low-quality picture and spotty playback? When dial-up was finally left largely in the dust in the mid-2000s, streaming movies took off. Of course, the movie rental business went under.
With the onset of smartphones and tablets within the last decade, we can connect with the rest of the world from almost anywhere, given that there’s a wifi connection or a cell tower nearby (the latter can lead to huge phone bills, though). Now we can enjoy music, movies, TV, radio or books from pretty much anywhere. We can also bank, make sure our homes are secure, and shop for groceries and other daily necessities before we even leave work for the day.
But the most important legacy of that first WWW is probably making instant worldwide communication accessible to everyone. It’s changed the way families keep in touch, expanded our social circles beyond local physical settings and, of course, changed the way many people meet their mates.
Think of it this way. Many of those couples who met on the internet in its early days, the 1990s, now have children who are at an age where they can marry and have children of their own. After all, 22 percent of people now meet their partners online, second only to meeting through mutual friends at 24 percent, according to eHarmony Australia’s 2015 Relationship Study. That’s a big turnaround from the time when meeting people online was considered the domain of rejects and outcasts. Times have changed, and those nerds and outcasts might be the ones who inherit the earth.

Originally printed in the Batesville Daily Guard
Follow Joseph on Facebook

Thursday, September 8, 2016

Buffet roulette

Originally printed in the Batesville Daily Guard
Remember those sick days you took off from school as a kid?
I sure do.
I, as well as many of my peers, would often exaggerate our discomfort to get those little breaks to stay home and do other things, usually watch TV and play video games. Those slightly elevated temperatures, red eyes or tight bellies were like the Golden Ticket out of the classroom if we brought them to the school nurse.
Of course, there was a huge difference between being “sick” and being really sick.
Really sick wasn’t much fun. Unlike with the occasional cold or too-greasy/spicy food from the night before, one couldn’t simply take a Tylenol or a shot of Pepto-Bismol and feel good enough to watch TV or play video games. Really sick came hand in hand with being really miserable, often unable to get out of bed.
I experienced “really sick” a few times in my youth. Usually, it’d be from catching one strain of the flu or another at school. The worst was when I caught pneumonia, which caused me to be bedridden for a week, lacking the energy to get up and even eat.
Getting really sick as an adult is a different matter. Unlike for a kid in school, for an adult even having to leave work early can have serious repercussions. For many, those few missed hours could mean being late on a bill or buying fewer groceries. For those who wind up missing days, it could mean not getting that month’s rent or car payment in on time.
Some of us are lucky enough to have understanding employers and the type of job that lets us make up the hours we were absent, or adequate sick leave. For that, many of us should feel blessed. But there are still many who don’t have that luxury.
But back to me: I got really sick last week. Not only was I sick, but so was the rest of my family. We believe that we caught the norovirus; our symptoms fit it, anyway. Norovirus is, and here’s the disgusting part, a disease spread by fecal-oral transmission. We believe we came into contact with it while eating at a buffet in a different city (from my experience, the buffets in Batesville have been great). I wasn’t ignorant of the fact that there was always a chance; it’s just that in the past, my risks had always paid off in a full belly and nothing more.
As they say, “you play with fire ...”
Norovirus infection can cause nausea, vomiting, watery diarrhea, abdominal pain and loss of taste. It also causes lethargy, weakness, muscle aches, headaches and low-grade fevers. Luckily it’s not usually dangerous and most people who contract it make a full recovery within two to three days.
Let’s just say it’s a very unpleasant experience.
The norovirus causes 19 million to 21 million illnesses every year, according to the CDC. The outbreaks usually happen in crowded environments like nursing homes, day care centers and cruise ships. Young children and the elderly tend to suffer the worst effects. The norovirus also causes 570 to 800 deaths each year.
What’s the best way to prevent it? Sufficient heating, chlorine-based disinfectants and polyquaternary amines, and washing your hands if you’re serving food. There’s also a vaccine, developed by the Japanese, in the human testing phase.
But even with the vaccine, I’d still rather not unknowingly consume fecal matter in the first place.
In my case, things apparently weren’t handled that way, so I got sick, and so did my entire family. It was a terrible experience, with both my wife and me feeling so weak that comforting our sick son seemed to take everything we had.
On the bright side, it’s a great way to lose weight if you’re looking to do so. In the span of 24 hours I managed to lose five pounds. Of course, as soon as my appetite returned, I started working on putting those five pounds back on.
So next time you eat out, make sure you pay attention to how your food is being handled. It may save you a few days of misery.
Follow Joseph on Facebook

Tuesday, August 23, 2016

Cultured meat is a future feast

Say hello to the future. The development of cultured meat opens the door to solving many of the world's food problems as well as environmental issues.
It might have sounded like science fiction just a few years ago, but today it seems we are just a few years away from having meat and leather that don't require an animal to be killed. These cultured meat and leather products are grown in just a few labs right now, but one day they will be mass-produced in large facilities.

This may sound unlikely to many people, especially the detractors who point at the very first lab-grown burger, which was presented to the public just three years ago. That burger was cooked and eaten at a news conference on Aug. 5, 2013, to a mixed reception. But the biggest news wasn't that the burger was grown or that it tasted like a real burger, but that it cost $325,000 to produce.

Even today, after it was reported that the same burger would cost just $11.36 to produce in 2015, a drop of $324,988.64 in just two years, detractors still point at the obsolete $325,000 price tag.

But despite the semi-skeptical reception and the prevailing anti-biotech attitude of many in the public, lab-grown animal products are still moving forward. The latest development is the announcement that Brooklyn-based startup Modern Meadow secured $40 million in Series B funding on June 28, bringing its total funds raised to $53.5 million.

“Modern Meadow harnesses the combined power of design, biology and engineering to change the way we think about materials, unlocking the capabilities of nature. Leather, which represents a $100 billion raw material market, has always been prized for its beauty, functionality and enduring status,” according to Modern Meadow CEO and co-founder Andras Forgacs. “Today, as a co-product of the meat industry, it is subject to fluctuations in availability, quality, price and growing demand. At Modern Meadow, we’re re-imagining this millennia-old material to create revolutionary new features without harming animals or the environment.”

Millions have also been invested in research and development for other companies, like Memphis Meats, a meat-growing startup.

According to Memphis Meats CEO and co-founder Uma Valeti, the company's goal is to have the meat available for retail by 2021.

Cultured meat is expected to have a widespread impact. It's being touted as producing as little as 4 percent of the greenhouse gases produced by livestock, a positive for many of those concerned about environmental issues. The impact would also reach bodies of water, which would be less susceptible to runoff containing animal feces. Another big plus is that it would require only a fraction of the land needed for cattle.

One example of the impact of raising livestock can be seen among our neighbors to the south in Central and South America. Since 1960, more than a quarter of the region's rainforest has been cleared for raising cattle; 70 percent of the forest in Costa Rica and Panama has been destroyed in conversion to rearing livestock, while in Brazil 40 percent of the land has been cleared for beef production, according to research biologist Brian J. Ford.

The livestock sector consumes 8 percent of all the fresh water in the world and occupies almost one-third of the world’s surface that isn't covered by ice and permafrost. It also contributes 18 percent of the greenhouse gases in the atmosphere.

Then there's also the idea that it could have a significant impact on the scarcity of food in the world. We are looking at a world population that is expected to keep increasing at least through 2100. Even though worldwide birthrates are declining — the average was 5.0 births per woman in 1960 and 2.5 births per woman in 2014, according to the World Bank — the population is living longer. Women in the U.S. alone are expected to have an average lifespan of 89 to 94 years by 2050, according to the MacArthur Research Network, while men lag behind at 83 to 86 years, according to ABC News. So even though the rate at which people enter the world is decreasing, the rate at which people leave it is decreasing as well.

Essentially, that means there are going to be a lot of poor people in the world to feed, and lots of people see cultured meat as the way to do it.

But is it safe? Some people are concerned about it being unnatural and whether it's genetically modified.

According to New Harvest, cultured foods are unnatural in the same way that bread, cheese, yogurt and wine are. Like those foods, cultured meat involves processing ingredients derived from natural sources. The group also claims that producing cultured meat is less unnatural than raising farm animals in intensive confinement systems, because those systems rely on synthetic hormones and artificial diets laced with antibiotics and animal wastes. Furthermore, the conventional production of meat has led to a number of health and environmental problems, including high rates of heart disease and food-borne illness, as well as soil and water pollution from farm animal wastes.

What kind of impact will it make? Likely a gradual one. After all, considering the power of industry lobbyists in the U.S., you can expect it to be tied up for years even after it's on the shelves elsewhere.

And why wouldn't the industry fight? You are looking at an industry that directly employs 482,100 workers in the U.S. with combined salaries of more than $19 billion, according to the North American Meat Institute. While that's far from the largest industry in the U.S., the people it employs would still have to find something else to do.

As the many people who have lost manufacturing jobs over the 20th century can say, technology changes things. A lot of those changes lead to at least temporary job loss. It's one of those things that comes with modernization. People, especially those who have their livelihoods tied to a given field, will resist.

But supply and demand will inevitably reign supreme and the majority of people will go for what's cheaper as long as it doesn't taste bad. That's just how the world goes.

This isn't to say animals won't still be raised for food. They will be, except with jacked-up prices for “real meat.” Same product, different marketing strategy.

Whatever happens, it looks like cultured meat is coming and it's going to change the world as we know it.

Follow Joseph on Facebook or Twitter.


Friday, August 19, 2016

Golden rice may be Vitamin A jackpot

 July 27, 2016
Ordinary rice to the left, golden rice to the right.
Originally printed in the Batesville Daily Guard
Ever hear of Vitamin A?
Sure you have; it’s up there with vitamins B, C, D, E and K as essential to a healthful life.
Vitamin A plays a critical role in vision, neurological function, healthy skin, strong bones, gene regulation, cell differentiation and immune function. It is an antioxidant, and thus is involved in reducing inflammation by fighting free-radical damage. A high-antioxidant diet is a way to naturally slow aging.
The best sources of vitamin A are eggs, milk, liver, carrots, yellow or orange vegetables such as squash, and spinach and other leafy green vegetables.
But in many parts of the world, many of these things are unavailable in the necessary quantities. This is especially true in areas where overpopulation and poverty are the norm. The most vulnerable people are the children of Africa and Southeast Asia.
There is one food that is widely available in these parts of the world, though: rice.
The problem though is that rice doesn’t have enough vitamin A to be effective. Naturally, anyway.
That’s where Golden Rice comes in.
Golden Rice is a genetically modified organism. Unlike regular rice, it carries beta-carotene, a precursor of vitamin A, which gives it the color for which it’s named. Like many GMOs, it contains genes that don’t originate in rice. The genes come from daffodils and a bacterium known as Erwinia. I know the word “bacteria” sounds scary to people, but remember, bacteria are for the most part tiny, tiny single-celled organisms. Like plants, some bacteria are beneficial to us and some are bad for us. Luckily, golden rice has passed safety standards and is safe for human consumption, like most GMOs on the market.
Clinical trials with adult volunteers from the U.S. concluded that “beta carotene derived from golden rice is effectively converted to vitamin A in humans,” according to the American Journal of Clinical Nutrition. The American Society for Nutrition said that “Golden Rice could probably supply 50 percent of the Recommended Dietary Allowance (RDA) of vitamin A from a very modest amount — perhaps a cup — of rice, if consumed daily.”
It sounds good, right? It’s even got the support of the Bill and Melinda Gates Foundation.
Well, instead of being seen as a way to help millions, for many it was akin to opening Pandora’s box.
Many anti-GMO activists, particularly Greenpeace, have made it their mission to prevent Golden Rice from being planted by farmers in vitamin A-poor parts of the world. Aside from spreading conspiracy theories about biotechnology companies, particularly Monsanto, they also attack the plots where the rice itself is grown. In 2013, a trial plot of Golden Rice was uprooted by a gang of protesters in the Philippines who claimed that U.S. corporations were only seeking profit.
But why the resistance?
Greenpeace claims “... GE ‘Golden’ rice is a proposed but not practically viable crop solution that has never been brought to market. It is also environmentally irresponsible and could compromise food, nutrition and financial security.” Of course, they never offer any evidence to support their beliefs. Instead, we get inaccurate claims that farmers can’t “save their seeds” or “the rice will contaminate existing species.” 
Greenpeace has already been taken to task over this by 110 Nobel laureates in a letter pleading with the organization to stop the fear-mongering.
Greenpeace’s response: The Nobel laureates didn’t offer “relevant expertise.”
Unfortunately for Greenpeace, nobody aside from anti-GMO activists is getting on board with it. Farmers associations in Nigeria support moving ahead with the cultivation of the rice, as does the Philippine Rice Research Institute. Anti-GMO activists accuse the groups and governments supporting Golden Rice consumption of being “bought by corporations” and have voiced support for radical groups that attack the farms where the rice is grown, destroying the crop.
So does the radical anti-GMO crowd offer an alternative solution?
“Plant sweet potatoes.”
Follow Joseph on Facebook or Twitter.

Wednesday, August 17, 2016

Whose law?

This statue of Baphomet is looking for a home, and Arkansas is on the list.
Just slightly more than a week ago, Arkansas State Senator Jason Rapert announced on his Facebook page that “After several months of waiting for the American History & Heritage Foundation attorneys to finish application paperwork for the Arkansas Secretary of State, I am advised they are now submitting the paperwork to begin the process of site selection approval for the Arkansas Ten Commandments Monument!”
That marks a milestone in a process that has caused a great deal of debate in Arkansas over where the separation of church and state should lie. Historically speaking, a Ten Commandments monument would be a violation of it.
Not that lawmakers don't try.
Oklahoma famously passed a law allowing privately funded religious monuments on the state capitol grounds. The kicker was that it was open to all religions, as long as they could afford to pay for their own monuments.
This backfired when The Satanic Temple did just that.
Of course, the Ten Commandments monument in Oklahoma came down pretty fast after lawmakers learned there was no way to stop The Satanic Temple from putting up a monument of its own on state capitol grounds. After all, the law was open to all religions.
In Arkansas, state lawmakers are trying to avoid what happened in Oklahoma by proclaiming that the Ten Commandments monument is not a religious monument. The argument being used is that “the Ten Commandments aren't a religious document, but the historical foundation of our laws.”
The Arkansas Ten Commandment Display Act states:
“The Ten Commandments represent a philosophy of government held by many of the founders of this nation and by many Arkansans and other Americans today, that God has ordained civil government and has delegated limited authority to civil government, that God has limited the authority of civil government, and that God has endowed people with certain unalienable rights, including life, liberty, and the pursuit of happiness;
“In order that they may understand and appreciate the basic principles of the American system of government, the people of the United States of America and of the State of Arkansas need to identify the Ten Commandments, one of many sources, as influencing the development of what has become modern law;
“The placing of a monument to the Ten Commandments on the grounds of the Arkansas State Capitol would help the people of the United States and of the State of Arkansas to know the Ten Commandments as the moral foundation of the law.”
The problem is that the Ten Commandments are mostly religious and moral rules, not the foundation of Western law. Even the Act itself sounds religious in nature, as it makes mention of God, divine endowments and morality — none of which are things most people believe the state should be involved in.
“What?” you might say. “But every elected official says they are historical!”
Well, that may be so, but the thing about elected officials is that they often tell people what they want to hear, especially if it's their base. In Arkansas, as well as most of the South, religious voters are a very big base, if not the biggest. Their support is what keeps many lawmakers in office. Things like the Ten Commandments monument make the more fundamentalist-leaning voters happy.
The problem is, most of the things on there aren't crimes, nor were they crimes in the time of the Founding Fathers of the U.S.
Of the Ten Commandments, only three are actually crimes: Thou shalt not murder, steal or bear false witness. The rest are basically good advice (don't cheat on your spouse, don't be a jealous jerk) or rules pertaining to practicing the faith (no graven images, no taking God's name in vain), which almost all self-proclaimed religious folks break on a daily basis anyway.
On top of all that, of the things that are actually illegal in the Ten Commandments, all of them were illegal before Judaism and Christianity had their respective boom periods in the Mediterranean region, which is pretty much the cradle of Western civilization.
Chances are, those things were fit for some sort of punishment even before the advent of writing. But it's one figure, Hammurabi, who made written law famous.
Hammurabi lived from 1810 BC to 1750 BC and was the sixth king of the First Babylonian Dynasty, reigning from 1792 BC to 1750 BC. Written in stone, his famous code had 282 laws. Among them were rules against stealing, murder and bearing false witness. Also among them is the famous “eye for an eye” rule.
The punishments called for by Hammurabi's Code were very uneven and depended on the perpetrator's social status. A poor perpetrator would always face harsher punishment, often death, while a rich criminal often paid only a fine.
Fortunately, no such uneven dispensation of justice exists in our society today. (For those of you that are a little dense, that's supposed to be sarcasm.)
But even Hammurabi can't be given credit for first transcribing those three laws. He was late by a couple of hundred years. The first person, as far as we know, to actually have laws against murder, stealing and false witness transcribed was another Mesopotamian — Ur-Nammu, founder of the third Sumerian dynasty. Ur-Nammu is believed to have lived around 2030 BC.
Moses' birthdate, on the other hand, is believed to have been around 1400 BC, hundreds of years after Hammurabi's and Ur-Nammu's deaths. By the time Moses came along, laws against murder, theft and bearing false witness were standard not only across the region, but across the known world as well.
Thus there's just not a valid case that the Ten Commandments are the basis of Western law. By the time the Romans started converting to Christianity, such laws had already been put in place centuries earlier by polytheists.
Now one can try to make an argument that the Ten Commandments are the “moral” basis for Western law, but even after the Christianization of the West, morality was a very flexible thing. After all, it's doubtful that Moses would approve of the Trinity, saints, crosses or the various iconography associated with Christianity today.
Other things, like coveting and honoring one's parents, have never really been addressed by Western law to any meaningful extent. Looking at the leaders of old, how many of them launched wars out of greed or dishonored their parents? A lot.
So where does that leave us?
It leaves us with a system based on man's law. Our secular law is not only supposed to protect Christians, but to hold them on equal footing with the likes of Muslims, Atheists and modern Pagans. If we let any one group take control of the law, there'd be nothing left for anyone else.
Sen. Rapert might want to take this into consideration. After all, he's one of the beneficiaries of secular law, which allowed his ancestors to pass their faith on to him. If the U.S. had been founded on religion, it's possible he wouldn't be following the “right” form of it.

Like Joseph on Facebook or Twitter

Thursday, August 11, 2016

Something for everyone

August 10, 2016
Originally published in the Batesville Daily Guard
It’s time for the Olympics again, and thank goodness they’re in the Western Hemisphere this time. Unlike with Beijing, London and Athens, those of us in the States can actually watch these games live instead of at 3 a.m. or on a taped replay.
Sure, most of the sports aren’t necessarily the sort that appeal to “fans.” After all, I don’t know of any country where archery or fencing are sports that millions, or even thousands, of people tune in to or fill up stadiums to watch, aside from the parents of the participants.
Now, there are several sports that have rather large fan bases like basketball, soccer and boxing. But two out of three of those sports lack the pros that draw the eyeballs to the screen.
And while baseball makes a return in 2020 during the Tokyo games, it will be rather limited and probably won’t have the presence of professional players either.
Soccer, which is the most popular sport in the world, is restricted to having all but three players on a team be 23 or younger. This leaves spots for only a few established pros, with the rest filled by younger, lesser-known players.
The lack of professional players isn’t necessarily a bad thing. We’ve seen stars rise from the Olympic ranks before. Arkansas’ rather troubled son, Jermain Taylor, is an example of someone who made his name at the Olympics and went on to success as a pro boxer.
Of course, baseball and soccer already have their own global events, the World Baseball Classic and the FIFA World Cup.
As far as basketball goes, this is its world event, and Team USA shines every time, from the “Dream Team” forming when the door was opened to professional players in 1992 to accumulating five gold medals in the last six Olympics. This year it looks like they’re well on their way to making it six out of seven.
But I digress; this is about the Olympics. This is sports for everybody.
It’s full of sports that we would probably never watch otherwise. Examples include handball, water polo and field hockey. Others are just kind of odd, like racewalking, trampolining and the equestrian events; I’m still not sure how that last one fits in as an Olympic sport (but it does make for some interesting pageantry). A few of them are games that many of us play in our basements, if we have them, and during family get-togethers, like table tennis and badminton.
Of course, there are also some big names people are tuning in to see. You’ve got NBA players representing not only the U.S. but also the teams of their home countries. You’ve got probably the most famous figure in American women’s team sports, Hope Solo, playing for the U.S. women’s soccer team. You also have Gabby Douglas, the teen gymnast who stole America’s hearts in 2012, returning to capture more gold, only to find herself out of contention for an individual all-around medal because of the “two per country” rule.
But, probably most of all, people are tuning in to see swimmer Michael Phelps, the winningest Olympian ever. Millions probably tuned in to the opening ceremony just to see him carry the American flag. It was the first time he had taken part in the opening ceremony, and this is also his last Olympics. As of this writing, he’s collected his 18th gold medal and is probably on his way to winning more. It’s hard to imagine that 31 is retiring age, but for an athlete at that level he’s already a senior citizen.
But the special thing about the Olympics is that from air gun shooting to the triathlon, we care about them all.
Over this week and next, we can expect new faces to capture our hearts. That’s part of the appeal of the Olympics. While professional sports fans may dismiss it, the rest of us tune in, cheering on our countrymen on a larger stage. A stage where, unlike in politics, economics and social matters, we can actually prove we’re the best at something — on a level playing field to boot.

Follow Joseph on Facebook or Twitter.