The Apologetic Billionaire

It was confirmed today that Microsoft has acquired Mojang AB, the Swedish company behind the hit game Minecraft, for $2.5 billion. Seems like a cause for celebration, especially for Markus Persson, the 35-year-old creator and majority shareholder known in the gaming world as “Notch,” right?

Not so much. He seems relieved, but perhaps also feels a bit hypocritical. Minecraft got too big for him to handle. More than anything else, he hated the hatred he experienced as the game grew. He took it very personally, it seems.

In June, he had already tweeted that he wanted someone to buy his share of Mojang so he could move on with his life.

“I’m not an entrepreneur. I’m not a CEO. I’m a nerdy computer programmer who likes to have opinions on Twitter,” he said in a post today, offering an explanation, and in many ways an excuse, for why he had to sell. (more…)

Another Snapchat Tale

It doesn’t seem like any of Snapchat’s hiccups backfire on them the way they would on other companies. A benefit of having a teenage-dominated user base? Perhaps. Today they pulled another pretty slick move.

But first a little recap:

Last New Year’s Eve, Snapchat was hacked, compromising the data of 4.6 million of its users. Keep in mind the sensitive nature of the messages sent on Snapchat. It was, legend has it, built for sexting, so it goes without saying that most of the users wouldn’t be pleased if their messages were exposed. As it turns out, they didn’t seem to care at all. The media buzzed about it, but it had little to no impact on Snapchat or its users.

Critics questioned the ability of the then 23-year-old (now 24) CEO Evan Spiegel to lead the company. Eventually that died down.

Fast forward to May. Emails from Spiegel’s time as the social chair of Stanford’s Kappa Sigma fraternity were leaked to Valleywag. The emails were, let’s just say, offensive, stereotypically fratty on steroids, enough to lead to the decline of Snapchat, or at least the removal of Spiegel.

[Enter critics]

Welp, he’s still there. (more…)

Decentralization of Media Power

Editing or censoring? Editing, argued Dan Gillmor in a recent post for The Atlantic. By removing photos and videos of James Foley’s beheading from their channels, Twitter and YouTube actively refused to participate in the spread of murderous propaganda carried out by irrational religious fanatics. Because (as far as we know) they were not ordered to remove the content by a government, it should be considered editing, not censorship, Gillmor said.

I was onboard at that point, following along, nodding my head, briefly pausing now and then to consider semantics.

Did Twitter and YouTube do the right thing? I think they did. Should we consider it editing instead of censorship? I don’t take issue with the distinction since I do usually associate censorship with government censorship and immediately think of China and Baidu, or a room full of North Koreans staring at Google’s homepage, keyboards untouched, while the VICE cameras roll, knowing full well that hitting “enter” would result in life imprisonment, maybe even execution (at 15:45). Censorship has a sinister connotation, and that just didn’t seem to be the case in this particular instance. However, if someone were to disagree, they’d be justified in doing so since censorship, by definition, does not require government intervention.

There was a reason Gillmor distinguished between editing and censoring. He wanted to address the precedents set when we appoint these mammoth platforms as unchecked gatekeepers. Usually, in less intense circumstances, the decision-making process would be more akin to editing than censoring.

When we allow the likes of Twitter, YouTube, and Facebook to dictate what information we consume, we give rise to “a concentration of media power that will damage, if not eviscerate, our tradition of freedom of expression.”

That’s where he lost me.

What tradition is that? It’s a good ol’ days way of looking at an issue. Oh, how we miss the days when we got all our news from a few major newspapers and three television networks that had no agenda and presented an unfiltered worldview! (more…)

The Routine Gene – Can Productivity and Creativity Coincide?

You know what IQ is, but do you remember what it stands for? Intelligence quotient (some say intellectual quotient). How about EQ and CQ? They’re real acronyms, I promise. Dr. Tomas Chamorro-Premuzic, a Professor of Business Psychology at University College London, defined them less than a week ago in the Harvard Business Review.

They stand for emotional quotient and curiosity quotient, respectively. In a nutshell, our emotional quotient dictates our ability to adapt to complex interpersonal situations and handle stress and anxiety. “People with higher EQ tend to be more entrepreneurial, so they are more proactive at exploiting opportunities, taking risks, and turning creative ideas into actual innovations,” Chamorro-Premuzic claims.

What does curiosity quotient mean? As you may have guessed, our curiosity quotient is measured by how inquisitive and open to new situations we are. People with higher CQ dislike routine, embrace ambiguity, and have a knack for finding simple solutions to complex problems.

This got me thinking: people with higher EQ are more entrepreneurial, and all of my entrepreneurial friends are intensely curious, which means they must have high CQ levels, which would mean they dislike routine. But they’re also incredibly disciplined. So do people with an entrepreneurial spirit really tend to dislike routine? In my own experience, I’ve found that while routine can be great for productivity, it certainly seems to stunt my creativity. It really is a double-edged sword. For example, I constantly change my work environment, especially when I’m writing, but tend to get up at the same time and go through the same routine every morning.

Confused by my own CQ, and obviously driven by it, I decided to ask some of my friends how they felt about routine.

Here’s what they told me: (more…)

Why An Eight-Hour Workday?

In honor of yesterday being Labor Day, let’s talk about a practice that’s often taken for granted, a standard so ingrained in our culture that many of us grew up without ever second-guessing it: the eight-hour workday.

There’s nothing scientific about the eight-hour workday. No evidence exists to suggest that we are most productive when we work eight hours per day.

Then why is it the standard, the norm?

Origins

It all started during the Industrial Revolution with a man named Robert Owen (1771-1858), one of the early fathers of socialism. Working conditions, particularly those of factory workers, were so poor in Great Britain at the time that Owen felt a responsibility to implement sweeping reforms. His attempts at reform included raising the minimum age of factory workers to ten years old and reducing hours. A balanced life, Owen thought, would consist of eight hours of work, eight hours of recreation, and eight hours of rest.

In an attempt to prove that factory owners could make a profit without treating employees like livestock, Owen bought a cotton mill in Scotland and began practicing what he was preaching. He increased the minimum age of workers, reduced hours, and offered sick pay.

It wasn’t until 1914 that Owen’s ideas took hold in the United States, and, believe it or not, it was one of America’s greatest capitalists, Henry Ford, who put them into practice. Ford, who is often credited as one of the most important driving forces behind the creation of an American middle class, reduced hours and increased pay to $5 per day, which was unheard of at the time. Ford’s profit margins doubled within two years. (more…)

Jump

Cliff jumping at Waimea Bay this summer

I’m a documentary fiend.

Last night I watched 180° South, the story of Jeff Johnson, whom Esquire referred to as “Patagonia’s Pro Adventurer” (what a title!), emulating the spontaneous 1968 trip that Patagonia founder Yvon Chouinard and The North Face founder Doug Tompkins took to Patagonia, Chile.

At the time, Chouinard was 29 and Tompkins was 25. They didn’t consider themselves businessmen. In fact, as they often say in the film, they thought of themselves as “dirt bags,” societal rejects.

They were living impulsively, deciding to make the six-month journey just two weeks before setting out. It was an adventure that helped shape who they are today and laid the foundations for both companies.

Granted, Chouinard and Tompkins made the trip in a different era, when a gallon of gas cost 34 cents and the unemployment rate was just 3.8%, but we too live in a different era. They had the freedom to be unconventional. For us, unconventional is actually the safest thing to be. (more…)

Technophobia

A recent Wall Street Journal book review began with the story of Johannes Trithemius, a German abbot and advisor to Emperor Maximilian I, who was concerned that Gutenberg’s printing press would have near-apocalyptic consequences, that the mass production of texts would shake the societal structures of the time. And it did, but looking back on it, there aren’t many who would argue that we’re worse off today because of it.

The book under review was “The End of Absence: Reclaiming What We’ve Lost in a World of Constant Connection” by Michael Harris.

Instead of embracing and celebrating the positive changes the Internet and connectivity have to offer, Harris seems to live in fear that future generations will miss out on many of the important moments that allowed him to flourish. He believes we are living in our own “Gutenberg moment,” but what he may find is that while we are going through a transition possibly as monumental as the printing press, he’s playing the role of a modern-day Johannes Trithemius.

This is a never-ending cycle. I think we’re hardwired to be concerned about the technologies of the time, while looking back fondly on the technologies we grew up with, and for good reason, because they helped us get to where we are today. In the future there will be plenty of people reminiscing about simpler times, when keeping in touch with friends was as simple as Facebook and Twitter, and finding information required you to type a query into a search engine.

In 1858, more than 350 years after Johannes Trithemius penned his concerns, The New York Times concluded that there could be “no rational doubt that the telegraph has caused vast injury,” claiming it was “superficial, sudden, unsifted, too fast for the truth.” Similar concerns followed the telephone, radio, and television. The Internet, and the technology we have to stay connected, is just the newest, and won’t be the last. (more…)