How I Failed and What I Learned

I can’t stop thinking about how I botched an assignment.

I know it’s not popular or common to talk out loud about your own shortcomings or how you failed at your job. While internet memes and online articles tell us to embrace failure as a learning opportunity, it seems we are rarely encouraged to actually share these shortcomings.

So, at the risk of deterring you from reading further, I want to tell you how I recently failed, and what I learned from it.

I didn’t deliver the product the client wanted. I struggled to communicate what their business was about. I realized afterwards what I was missing: the story.


The Story of How I Failed

The client’s business had changed its direction since it first started, and they needed to update their website.

In a meeting, they described the business as it currently existed and what they were trying to achieve. They wanted the copy to walk a fine line between corporate and familiar – refined but still a little sexy.

Sexy B2B copy? Quick, witty, punny, smart – those are corporate tones I love to play with. But sexy? For whatever reason I was so hung up on this idea. In draft after draft, defining this tone became my primary objective.

And that was my big mistake. I was too focused on how to shape this voice. I felt so lost in this goal that I ended up creating copy that

  1. wasn’t sexy; and
  2. didn’t sound like me or the company at all.

It wasn’t until after I spoke with someone else in the company about how the business started that I realized this copy was all wrong. Had I just focused on telling the story from beginning to now, I would have naturally found its voice.

I was so involved with nearly re-branding the business that I lost sight of the goal: to communicate what the client wants customers to know about their business.

What I Learned From My Failure

Writing good copy is important, but the craft shouldn’t distract from the goal. I took it upon myself to design a brand new voice for the client when they didn’t need one.

I also need to allow space for the story to speak for itself: leave out the flowery language and clever wordplay. More often than not, simple and straightforward is the best way to go.

And while I usually embrace this method of writing, I was trying too hard to write something new, to be something that isn’t myself, and that’s not why I was hired.

The other important thing I’ve taken away from this experience? Being myself is the best thing I can be. That’s why I was hired in the first place. Had I just written the way I normally write, the client and I would have worked together to revise it, and the right voice would have emerged.

There’s just no sense in trying to be something I’m not when I’m hired to be me.


If I haven’t already scared you away, you should know this: I pride myself on being a good listener. It’s what I enjoy about copywriting – listening to what people have to say, and then translating it into what they need.

That’s what I want to do for you. Sometimes it’s not enough to tell your audience about the problem you solved with your product – you also have to tell them about where you’re going next and what you can do for them. Here’s how I help you do it.

Can We Save the Cows and Eat Them, Too?

Another report has been released stating that climate change will be the end of us if we don’t act fast and drastically. Reducing our carbon footprint is imperative to slowing the effects and prolonging humanity’s time here on Earth.

One of the quickest choices that many promote as a way for individuals to reduce their impacts on the environment is eliminating meat – specifically beef – from their diets.


Stop Eating Cows

Compared with pork, chicken, dairy, or egg production, producing beef requires 28 times more land, 11 times more irrigated water, and 6 times more fertilizer, and it generates 5 times more greenhouse gases. In the United States, beef requires 7–10 kilograms of feed to produce 1 kilogram of edible protein. Roughly 13 million hectares of land are deforested every year to create space for both livestock and feed production, and approximately 65% of livestock greenhouse gas emissions come from beef.
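Those feed numbers translate into a stark ratio. Here’s a quick back-of-envelope sketch – the 8.5 kg default is just my own midpoint of the 7–10 kg range above:

```python
# Back-of-envelope: feed required per kilogram of edible beef protein,
# using the 7-10 kg range cited above.
def feed_needed(protein_kg, feed_per_protein=8.5):
    """Estimate kilograms of feed for protein_kg of edible beef protein.

    The default of 8.5 is just the midpoint of the 7-10 kg range;
    pass 7 or 10 for the low and high ends.
    """
    return protein_kg * feed_per_protein

# At the midpoint, 2 kg of edible beef protein takes:
print(feed_needed(2))  # 17.0 kg of feed
```

Compare that with the grazing-cattle figure later in this post – less than 1 kilogram of feed protein per kilogram of edible protein.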

And besides the resources it takes to maintain cows, the World Health Organization (WHO) released a report in 2015 saying that red and processed meat aren’t terribly great for our health, either.

So it follows basic economic principle that if demand decreases, supply will decrease, right?


Incentivizing Everyone in the Food Chain

The recently released report encourages governing bodies to implement stronger regulations in order to substantially curb the impacts of global warming.

So how does that happen with the beef industry? What happens to the cows? What happens to the farmers? What happens to everyone whose livelihood depends on beef?

People and businesses have dedicated themselves to supporting the beef industry. As a result, some could be left with a limited set of skills, while others would be left with a failing business – both vulnerable to the ever-changing landscape of consumerism and innovation.

So how do we incentivize people and businesses to change their practices?

This would be the perfect time for governments to set an example of how to help their people transition from a toxic industry into a thriving one. It won’t be easy to simply shut down farms and processing facilities and convert them into crop-producing farms.


Healthier Cows, Healthier Planet

If the majority of the beef-eating world were to stop eating beef tomorrow, there would still be cows. It isn’t really responsible (or possible) to just let the cows loose on the land, nor can we slaughter them all at once.

One option for farmers raising beef is to convert their land to pastures – grazing cattle require less than 1 kilogram of protein from feed to produce 1 kilogram of edible protein in the form of both meat and dairy. The resulting product from grazing cattle is also higher quality than industrially farmed beef. Alternatively, they could convert their land to raising crops (hydroponics or otherwise) instead of cows.

All of this is easier said than done, of course. Farmers would need to off-load equipment, purchase new equipment, re-train employees, and meet a new set of standards. They would essentially be starting a whole new business.

They could, however, at least work towards healthier, more ethical forms of farming. Tech start-up Connecterra developed the app Ida for dairy farmers to track the health of their herd. By tracking patterns and behavior, the app can provide insights into abnormalities in order to prevent widespread illness. Connecterra has also recently teamed up with the Internet of Food and Farm to improve sustainable farming practices for The Happy Cow Project.

Final Thoughts

States are making a pledge for renewable energy, and cities are encouraging zero-waste efforts, both of which are admirable and important. But it’s time for the major players to step up and commit to decreasing our contributions to global warming. Creating incentives for beef farmers is a great place to get the moovement going.


“Livestock and Landscapes,” Food and Agriculture Organization of the United Nations.

“Raising Beef Uses Ten Times More Resources Than Poultry, Dairy, Eggs or Pork,” Rachel Nuwer, Smithsonian Magazine Online, July 21, 2014.

“More Fuel for the Food/Feed Debate,” Food and Agriculture Organization of the United Nations.

“Is the Livestock Industry Destroying the Planet?” Alastair Bland, Smithsonian Magazine Online, August 1, 2012.

“Edible Insects – Future Prospects for Food and Feed Security,” Food and Agriculture Organization of the United Nations.

“Land, Irrigation Water, Greenhouse Gas, and Reactive Nitrogen Burdens of Meat, Eggs, and Dairy Production in the United States,” Gidon Eshel, Alon Shepon, Tamar Makov, and Ron Milo, Proceedings of the National Academy of Sciences, Summer 2014.

“Tackling Climate Change through Livestock,” Food and Agriculture Organization of the United Nations.


Food Safety: It Just Isn’t Sexy

Slave labor. Fair trade. Organic. Free-range. Cruelty free. Non-GMO. Gluten free.

Chances are, at least one of these terms has sparked your interest in food ethics or production. These are all attention-grabbing, infographic-friendly terms and subjects.

They come with images of blue skies and green grass, maybe just a couple shades off from the default Windows wallpaper of rolling green hills. You can already imagine the lens flare and the smiling face holding two palmfuls of dirt – there may even be a sprout popping up from the middle.

But what comes to mind when you think about food safety? Stainless steel counters and people wearing hair nets? Lab coats? Fluorescent lights? Computer monitors displaying graphs and molecular structures?


Food safety simply isn’t sexy. In fact, anything that brings scientists to mind when it comes to food just doesn’t seem natural.

But then there’s an E. coli outbreak and the first question is always: how did this happen?

This isn’t to imply that organic romaine lettuce or free-range chickens are to blame. It’s important to care about how we impact the environment with something as ubiquitous as food production. It’s equally as important, however, to care about the standard operating procedures (SOPs) that keep that food safe for consumption.

In this most recent outbreak of E. coli in romaine lettuce, the first reported illnesses started at the end of March 2018, with an official statement from the Centers for Disease Control and Prevention (CDC) released on April 10, 2018. On May 16, the CDC announced that “the last shipments of romaine lettuce from the Yuma growing region were harvested on April 16,” but it wasn’t until June 28, 2018 that the CDC declared the outbreak “appeared to be over.”


Are you still with me here? That’s:

  • 79 days when consumers weren’t sure if it was safe to eat romaine lettuce
  • at least 36 days that restaurants, suppliers, and grocers had to spend finding an alternative source for their romaine lettuce
  • over 79 days that both the government and the farmers spent trying to determine the source of the outbreak
  • 210 people infected with that strain of E. coli across 36 states in those 79 days
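For anyone double-checking, those day counts fall straight out of the dates reported above:

```python
from datetime import date

# Key dates from the CDC's updates, as cited in this post.
first_statement = date(2018, 4, 10)    # official CDC statement
harvest_announced = date(2018, 5, 16)  # CDC announces last Yuma harvest
declared_over = date(2018, 6, 28)      # outbreak "appeared to be over"

# Days consumers were left wondering, from the first official
# statement to the all-clear:
print((declared_over - first_statement).days)      # 79

# Days between the official statement and the announcement that
# Yuma shipments had stopped:
print((harvest_announced - first_statement).days)  # 36
```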

Doesn’t that seem like a terribly ineffective process? Where is the outrage?

What seems most outrageous about the situation is that the E. coli wasn’t traced back to one specific farm lacking SOPs or overlooking a bad batch of lettuce. The strain of E. coli was found in the canal water, which feeds the whole growing region. According to the final update provided by the CDC,

“[The Food and Drug Administration] is continuing to investigate the outbreak to learn more about how the E. coli bacteria could have entered the water and ways this water could have contaminated romaine lettuce.”

Knowing where your food comes from is important. Having relationships with farmers and butchers can deepen our understanding of the food we consume.

But the industrialization of food isn’t an aspect of life we can ignore, and proper food safety practices are essential to our well-being. In fact, when food safety is practiced responsibly, we can enjoy produce in its freshest, most healthful state all year round. People all over the world can experience flavors from different countries and can even incorporate foreign flavors into local cuisines.

Awareness and education are how we learn to eat healthy and sustainably. From food deserts to world peace, we can truly make the world a better place with the food we choose to put on our tables.

How does blockchain technology help the food industry?

Blockchain, blockchain, blockchain. It’s all anyone wants to talk about (including myself, to some extent).

In just the past week, people have sent me two articles about blockchain and the food industry: The Next Web wrote about Australia recently shipping almonds to Germany with blockchain technology, and Forbes wrote about some of the major players working to implement blockchain technology in the food manufacturing industry.

Clearly businesses see potential benefits from using blockchain technology – but what are they?

Blockchain Technology and the Food Industry

Cut the Red Tape, Reduce Waste, and All the Other Things We Associate with Bureaucracy

One of the benefits frequently highlighted is the reduction in bureaucratic waste. By replacing paperwork and verification steps with blockchain technology, businesses can reduce costs and save time.

The assumption here, of course, is that everyone in the supply chain uses the same blockchain software – or uses it at all. These articles talk about huge companies with the resources to do these trial runs, which is helpful for smaller businesses that can’t afford to test the waters themselves. But everyone involved in the experiment is testing the same software.

I realize that may seem obvious – and necessary for product development – but the term “blockchain” is thrown around so freely that articles make it sound universally compatible and accessible, as if it were a new technique rather than a new technology.

As far as I’ve seen, different companies (like IBM and Microsoft) are developing their own solutions, which means it’s unlikely those solutions will speak to each other. Why would one business want to be compatible with another business’s software for the same purpose?

It’s the same type of dilemma that the medical industry faced with digitizing medical records. Sure, it technically eliminated the need for fax machines, but because none of the solutions speak to each other, hospitals and medical offices are still using fax machines. (For more about this, listen to Vox’s The Impact, or read my post about blockchain and the medical industry.)


E. Coli, Salmonella, and Other Scary Words That Make Your Stomach Crawl

Another common example is that of foodborne illnesses. If there’s an E. coli outbreak, officials would be able to trace the source in a matter of seconds and identify where the problem originated.

For example, take the romaine lettuce E. coli outbreak in the United States in spring 2018. Illnesses were reported at the end of March, and the Centers for Disease Control and Prevention (CDC) released an official statement on April 10. Between April 18 and May 16, the CDC suspected that the contaminated lettuce came from the Yuma region and continued to investigate. On May 16, they released a statement saying that all contaminated lettuce from the Yuma region had last been harvested on April 16.

In this instance, blockchain technology could have potentially stopped the contaminated harvesting sooner. Instead of piecing together enough evidence only after the last harvest date, the CDC could have identified the source earlier with the near-immediate record retrieval that blockchain technology promises.
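To make the traceability idea concrete, here’s a toy sketch of a hash-chained shipment ledger. This is purely illustrative – the actors are made up, and the real systems being piloted by IBM and others are far more involved:

```python
import hashlib
import json

def make_block(record, prev_hash):
    """Link a shipment record to the previous block's hash, so any
    tampering with history changes every hash after it."""
    payload = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
    block_hash = hashlib.sha256(payload.encode()).hexdigest()
    return {"record": record, "prev": prev_hash, "hash": block_hash}

# A fictional head of romaine moving through the supply chain:
chain = []
prev = "0" * 64  # genesis placeholder
for step in ({"actor": "Yuma Farm A", "event": "harvested"},
             {"actor": "Distributor B", "event": "shipped"},
             {"actor": "Grocer C", "event": "received"}):
    block = make_block(step, prev)
    chain.append(block)
    prev = block["hash"]

# Tracing from shelf back to farm is just a walk up the chain:
# near-instant, provided everyone wrote to the same ledger.
print(chain[0]["record"]["actor"])  # Yuma Farm A
```

The catch, as noted above, is the “same ledger” assumption – the lookup is only fast if every farm, distributor, and grocer actually recorded to compatible software.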

But where do we lag the most in food illness outbreaks? Is it the time businesses spend trying to determine where a shipment came from? Or is it more about the resources available to the CDC and the constraints of simply being human?

The first reported illness from this outbreak was on March 22 – 19 days before the outbreak was officially declared. And even when the CDC released their official statement on April 10, they were not sure of the cause. Romaine lettuce wasn’t pegged as the culprit until April 13.

The process of collecting data to prove an outbreak takes time, and that’s not necessarily something blockchain can resolve. Take it one step further – can blockchain even help prevent an outbreak from happening in the first place?

Final Thoughts

Overall, I’m excited about blockchain technology. I still love seeing all the new trials and applications of it. I’m just hesitant to put all my eggs in this trustless basket for now.

Blockchain for the Medical Industry? Maybe.

This morning The Next Web posted an article suggesting that “blockchainification” is just one more possibility looming in the future of medicine.

It is certainly possible – blockchain is the cool new kid that everyone wants at their birthday parties. But how will it serve the medical community?


Making communication more efficient

This article immediately reminds me of an episode of the podcast The Impact, which explores the human consequences of United States policy-making. Their first season focused on health care policy, during which the host Sarah Kliff investigated why the industry is still so heavily dependent on fax machines.

In Barack Obama’s first presidential term, the government “spent upward of $30 billion encouraging American hospitals and doctor offices to switch from paper to electronic records.” It was massively successful, in the sense that nearly 85% of hospitals were using electronic medical records by 2015.

It was unsuccessful, however, in deterring dependence on fax machines: while hospitals now have digital records, they all use different software programs that don’t speak to each other. As a result, hospitals and offices still have to use the fax machine.

Businesses found an opportunity to monetize this digital revolution in the medical industry. Ultimately, the ease of sharing medical records is not in the interest of most companies (and hospitals) because it makes it easier for the patient (or consumer) to switch providers.


More technology, more problems

Which brings me to my original concern with blockchain technology. It all sounds well and good to have this incorruptible digital ledger, but can it be read by any software? Or will everyone in the supply chain need to have the same program?

While it seems to be in everyone’s interest to exchange research and ideas, it is not necessarily a profitable business practice.

Another obstacle that “blockchainification” may face, according to The Next Web, is the recently implemented General Data Protection Regulation (GDPR). It would seem, however, that blockchain technology may actually work in GDPR’s favor, because it has the potential to give control of data to the patient rather than leaving it locked in the hands of the healthcare provider.

There are still significant questions like this that need to be explored before blockchain is meaningfully implemented throughout industries.