Does money or quality define seriously good art? And who said art had to be serious?

The clash of art and business is a mighty one. It’s one that hits close to home as well.

My mother wanted to be – and, to be fair, now is – a professional artist when she was growing up. But, as any responsible parent would, hers told her that the life of a professional artist is hard. So when she went to get a college degree she went practical and studied business.

Not a bad idea. In fact, a very smart one.

But years later, after a successful career in business that she gave up to raise my sister and me, she returned to art in a big way. She’d painted all this time here and there, giving paintings away to family and friends. A chance meeting with an old friend of my father’s led her to a showing at a gallery, to a women’s workshop, and into Borders bookstores (you can tell how long ago this was). She blended her business sense with her art talent and sold paintings, and then through volunteer work got an art director position crafted for her.

That’s rare. It’s hard to make art a career, but my mom, through a zig-zagged path, did just that.

But the legitimization of her passion for art came only when it blended into her professional pursuits. The paintings and the talent had been the same for years; it was becoming a blip on others’ radars that got her the business cards labeling her a professional artist.

So is that what makes art good? Its monetary value, the fact that it sells? Is that even the purpose of art – to be good?

Poptimism discusses this in the realm of music and music criticism. The New York Times recently called it “a studied reaction to the musical past. It is … disco, not punk; pop, not rock; synthesizers, not guitars; the music video, not the live show. It is to privilege the deliriously artificial over the artificially genuine … Poptimism wants to be in touch with the taste of average music fans, to speak to the rush that comes from hearing a great single on the radio, or YouTube, and to value it no differently from a song with more ‘serious’ artistic intent. … In this light, poptimism can be seen as an attempt to resuscitate the unified cultural experience of the past, when we were all, at least in theory, listening together to ‘Sgt. Pepper’s’ or ‘Thriller.’”

The author then compares it to criticism and “good” art in other genres, finishing with “[n]o matter the field, a critic’s job is to argue and plead for the underappreciated, not just to cheer on the winners.”

My immediate thought is that Thriller was both widely popular and critically acclaimed, and that there are certain pop artists, like Beyonce or Lorde, who still garner this kind of wide sweep. But being indie doesn’t mean you’re better, just like being widely bought doesn’t mean you’re better. And in a world where we have access to just about everything, having a Thriller is less and less likely. It’s not that there’s no talent in pop music – it’s that everyone’s pop is different.

The discussion at NPR made me feel much more sane, noting “in a globalized, polycultural, multilateral, warming, mass-migrating world, we have urgent questions such as, ‘Where is the center?’ ‘Which information matters?’ ‘Who benefits?’ ‘What does that make me?’” It states my general thought in more precise language: the discussion of pop music, and thus of what is popular, is so much more interesting now because “Where is the center?” is a question more pressing today than ever before. Your taste isn’t bound by your parents’ record collection and the Betamax tapes/cassettes/CDs your older siblings played in your shared room. It’s bound only by your imagination and your faculties with Google.

Pitchfork also adds a refinement to this, and an ancillary concept: “The point is that popularity is a number, but ‘pop’ is a concept. To its enemies it suggests a dystopian image of music served up like condensed food pellets from some uncaring hand, forced into our living rooms and offices, inescapable. To its friends it is something inclusive, a unisex, one-size-fits-all party smock, the thing that draws everyone to the floor.”

The part that’s the most interesting to me is that all of this discussion is based on a dynamic I saw my mother grapple with when I was small, one that haunted me throughout all my internships – where I gave away my creativity for the pay of $0 and experience cents for months upon months – and one I even see now as a content creator in the big bad world: money makes art “real,” “appreciated,” and “professional.” And these critics, in their bellyaching about losing their thrones as the arbiters of what people buy, under the guise of making the underappreciated more “appreciated,” really are just falling into the trap that created the mega popstar in the first place – it’s all about valuing the monetary gains to prove the art you’ve made (or critiqued) is good. Even in the fight against just “cheering on the winners,” they rate their success as critics by turning “losers” into new “winners.”

Arguing that poptimism – and by extension the popular – need not prove its significance (its sales have done that) but should stress its further impacts, Pitchfork makes the point that I think is the most important one I’ve read: “I sometimes worry that serious music can only be served by serious talk, or worse, that people who like serious music can only have serious reasons for doing so. The truth is that you will probably meet just as many shallow people at a National show as you will at a Miley Cyrus show, the difference being that people at the National show are more likely to think they’re important, while people at a Miley Cyrus show are more likely to think they’re having fun.”

So does the world think that good art is valued by expertise in execution or by especially good economic returns? In the abstract people would say quality, but in real terms, even the art critics allow their good work to be valued in monetary terms. Is that good? I doubt it, but I know that even after reading tons of art-flavored poptimism that would encourage me to lean Dali or Pollock rather than Woodberry, I love my mother’s paintings with a ferocity that only slightly outweighs how hard I bump “**Flawless” from time to time. And that’s OK, because I’m going the Cyrus fan route and having fun.

What the Robocop reboot taught me about utopia

George Orwell, Aldous Huxley, Bill Gates, and Robocop have forever changed my view of utopia.

The irony of this is that the discussion of what utopia is can be strangely both apolitical and completely partisan, and Orwell and Huxley juggle this dichotomy in 1984 and Brave New World, respectively. But other than expanding my 15-year-old mind to explore the idea of a utopian society, I’m not sure if I ever wrestled with what a real, applicable utopia would look like – and if it would even be a good thing to have.

This train of thought started with a benign discussion of robots.

Mashable talked about how tiny robots can build big things, noting “SRI International, a non-profit research firm serving government and industry, has found a novel way to control tiny, low cost magnets via electromagnetic pulses delivered to them through contact with printed circuit boards,” and went on to say that SRI is expanding this so that these robots can manipulate tools, and that the research firm aims to “enable an assembly head containing thousands of micro-robots to manufacture high-quality macro-scale products while providing millimeter-scale structural control.”

Mashable said this was both “amazing, and a little bit creepy,” and I tend to agree.

I pondered the thought of jobs disappearing to bots, a thought that Bill Gates has talked about at length and one that the Economist ran a big story on in June.

“Software substitution, whether it’s for drivers or waiters or nurses … it’s progressing. … Technology over time will reduce demand for jobs, particularly at the lower end of skill set. … 20 years from now, labor demand for lots of skill sets will be substantially lower. I don’t think people have that in their mental model,” Gates said, noting that eliminating income and payroll taxes, and creating tax incentives for businesses to keep humans on, might be a good thing. It’s all technological unemployment.

But I can’t lie: Whenever I order takeout online, my order is always right – not always the case when a human jots it down. It’s not their fault, it’s just human error. And to be fair, the Economist did say there’s an almost 90 percent chance that what I do 9-to-5 right now will be done by robots in two decades. I am sure that no matter how many grammar rules I memorize, the number of times I’ll put in a typo will be greater than a robot’s, because of human error.

And this “lower end of skill set” is close to a fifth of the American population, as most of the 10 most common jobs in America are low-wage work, including retail salespeople and cashiers, office and administrative support, laborers, and janitorial workers. Registered nurses were the only ones with average salaries above the national average of $22.33 an hour – and nursing was also one of the jobs that Gates said might get phased out by bots.

It’s scary to think that the way my father and many other people found their way into the middle class could be gone in favor of bots.

This reminded me of the recent reboot of Robocop that I saw in theaters. (Who can say no to Samuel L. Jackson in a wig and car chases?) Being a cop is one of those jobs a blue collar guy could have and feed a family on. So when I thought about the different forces at play in Robocop – typified by Raymond Sellars, who said, “Forget the machines. They want a product with a conscience. Something that knows what it feels like to be human. We’re gonna put a man inside a machine,” and Dr. Dennett Norton, who said, “The human element will always be present! Compassion, fear, instinct, they will always interfere with the system” – it made me think: 1) Where do we draw the line as to what we will and won’t let robots do? And 2) What’s the greater impact of the blend of man and machine?

Now how does this all relate to utopia? Well, the 2014 Robocop starts with this vision of saving our brave men and women by putting robots on the street to deal with crime, creating a society where the crime rate can become zero – one of the facets of utopia that people crave. In talking about what these already-invented magnet robots Mashable raved about could do, a friend said, “The real life applications for things like these robots, 3D printing, and linking the brain to technology is like… woah. What if you had these tiny little robots with 3D ink and have them build a bridge. In my naive view, technology is the key to a more utopian society. If we can get technology to cover our basic needs, then maybe we’ll have more time for art, music, literature, etc.” (I have smart friends, and I don’t think the view is naïve at all.)

Maybe this could work. Have robots worry about fixing our plumbing when it springs a leak; have the state save human lives and send machines into the risky situations on the streets.

But then my thoughts flick to George and Aldous and how they grappled with the use of technology to control society, the control of information and history, and the dangers of an all-powerful state that can exert psychological and physical control over us through doublethink, through controlling the purse strings, through exerting power over heaps of metal that roam the streets assessing whether we are a threat without the knowledge of what it feels like to be a man, a woman, a child, to express compassion, to react to fear, to work on instinct.

So what is utopia? Is it robots covering the basics and letting us create the high-level cultural assets that will exist long after we turn to dust? Is it a libertarian view of limited government that allows us more freedom to live our lives the way we choose? Is it a progressive view of government that provides fiscal reform and regulation over private entities to prevent corruption? And are we OK with allowing machines to control the little things in our lives? When do the little things turn into the big things, and the big things turn into everything?

I surely don’t know exactly how the Internet works, but I use it to pay bills, order food, talk with friends, even post this screed. So maybe the question isn’t are we OK with the thought, or where the line is, but how to grapple with the fact that we already sleep with our phones and freak out when our Internet connections stop. Maybe we aren’t talking about a future we ponder, but a present we already have.

Updated 4/21: Mashable once again teaches me something I don’t know, this time in op-ed form, which bashes the end of this blog post … and in delightful fashion.
The tl;dr version can be summed up with this:
“The varieties of ways in which drones are improving our lives is a great example of how the unintended consequences of technology can often be positive ones — no matter what Hollywood would have us believe for the sake of a story. Hubris was not the only lesson of Mary Shelley’s Frankenstein; it was also written to show how men fear and destroy what they don’t understand.
So understand how slowly and tentatively the new robot race is coming into being. Reserve your fears for more clear and present nightmares — such as the political candidates that will emerge from a campaign finance system that allows for unrestricted funding. And when your ASIMO finally arrives at your home, many decades from now, and pauses to look and smile at you when it pours the tea, remember to smile back.”

“Wait, so why do you live in Arizona?”

Between the effect of local political activists on policymaking, which with SB 1062 raised concerns about whether state legislation could try to legalize discrimination; between police firing pepper spray into crowds (and at the reporters covering them) that took to the streets after the Arizona Wildcats lost in overtime in the NCAA Tournament; between the water issues that make the vibrant and culturally rich city of Tucson struggle to accommodate the demands of its population; and between the constant news of its internal struggles surrounding women’s reproductive rights, the tragic falls of Arizona college students that led to death, closed-door caucus comments, its militarized border, and its education woes, I’m never surprised when people ask, “Why do you live in Arizona?”

I am not the typical Arizonan who gets pictured in national media, although I was born here, as were my sister and my father, and I’ve spent a large portion of my life (13 of 23 years of it, as a rough estimate) living within its borders at various times and in various cities.

I am, however, a fan of its gorgeous sunsets; of its amazing carne asada; of its southern university where I earned my degree; of its ability to house a hipster bastion like Tucson replete with amazing dining and bars and sights; of its mountains; of its more northern cities such as Sedona, Prescott, and Flagstaff; and of its lesser known and less frequently visited places to the south, like Tombstone, Patagonia and Sierra Vista, and further north, like Jerome, Cottonwood and Kingman. 

I am shaped by Arizona as much as I am shaped by being an American, a minority, a female, a writer, a millennial.

It has given me some of my closest friends. It has given me the chance to grow from girl to woman. It has given me a chance to learn, and grow, and enjoy. However, it has also caused me to catch a lot of flak, from strangers and friends alike. (This is not something I’m completely startled by, as I spent a significant amount of time also living in Las Vegas as a child, a city which is as interesting and complex and dirty and brilliant as you can imagine.)

So, the inevitable follow up question, after I list my likes, is: “So do you want to live in Phoenix (or insert Tucson, or another city in Arizona) forever?”

…and my inevitable and steadfast answer: Maybe. Maybe not. I’ve lived in California for a time too, and might go back. Or maybe my parents will finally get their wish and I’ll move back to Vegas and closer to family. Or maybe I’ll move to Tokyo.

But as much as I laugh when 30 Rock lampoons Arizona State University (I’m a Wildcat forever, even though I appreciate the programs ASU houses more than my undergrad school spirit cares to admit), as much as I chuckle when Jon Stewart bloviates on the Arizona legislature again and again and again, as much as I can cop to the fact that Arizona has its share of divisive characters who weave their way onto the national political stage, I still like the state. I like its craft breweries and wine market, its Wild West history, its small businesses, and its hiking, all while forgiving its missteps borne from an American history of both successes and failures that echo across all of America’s many cavernous parts.

I wouldn’t be the person I am without Arizona, and so I have learned to enjoy it past its faults – at least for now. Plus, unlike Google, I still haven’t seen the Grand Canyon, so I have to stay for a little bit longer, right?

Black. Female. Millennial. Invisible.

My sister and I were (subconsciously) always tasked with being examples. As the youngest of a generation on one side of the family and the oldest on the other, we were in an interesting spot where we somehow provided an example of what could be done with our genes, our bodies, our last names.

“It’s important to me that they see that they are building upon a foundation … We have to continue to build each generation. It’s important for our uplift as a people and our uplift as women.”

And so, even though my sister and I both went to a major state school, have successful careers, never moved back home to live in our parents’ basement, and were born in the Gen Y era, we rarely, if ever, see our story reflected in the media. The black millennial story, especially that of black millennial women, has not been told at great length – especially when you exclude those accounts couched in the trappings most columns on the African American experience tend to touch on (e.g. the effects of stop and frisk on NYC’s minorities, the visions of us in media and in politics, the tale of a multiethnic woman who has become our vision of the Welfare Queen, and so on). I know that as much as I try to skirt the issue, people of color are still seen as an other – a voting bloc to be captured, a section of people to be turned into mascots, a stereotype to have parties about.

I meditated on this fact at length as I came to the end of Americanah by Chimamanda Ngozi Adichie. Yes, the author shouted out by Beyonce. But her novel explores a global and connected world, race relations in America, sexual politics, immigration issues, economic class clashes, and even love in an ever connected, multiethnic and evolving world. (Plus, she’s just a brilliant author and speaker.)

Seeing this nuanced portrayal, one that took five years to make and was now easily available to be picked up, bought, read, re-read, analyzed, shared, and cherished from the bookshelves of my local Barnes & Noble, I decided to Google “black woman” to see what the results would be. They were either nondescript or startling.

[Image: Google image search results for “black woman”]

Although I have in fact occupied many a chair in my day, I have never been shot nor thrown a table. C’mon, I’m not a Real Housewife of New Jersey.

The black female portrait in society is multifaceted but also dramatically flawed. (This, I concede, is not a problem singular to the black female, but that’s the life I know, so it is the one I will speak on.) We have Oprah and Michelle Obama and Olivia Pope and Mary Jane and Laverne Cox and Beyonce, and also Basketball Wives, Love & Hip Hop, and the Atlanta edition of the aforementioned housewives franchise. We take steps forward and also take them back.

As the millennial discussion continues to be dominated by tales of trying to decode millennials, as if a column titled “7 Ways Millennials Are Just Like You, Baby Boomers” could really unlock the secrets to speaking to, understanding more about, and working with millions of people, I feel like those Google results above aren’t just happenstance. They really are representations of what some people see black people to be, and so the black female millennial experience falls into the trappings of the experience of generations before, even if we are not the same as those generations. So having a minority name alone makes employers less likely to consider your resume. So the pains of the Great Recession still hit black and other minority workers harder because of the lack of social networks at our fingertips. So people believe in reverse racism against them in schooling and in work, and think race isn’t a problem but rather just black people playing the race card. So people still feel as if minority youth aren’t safe on the streets while others are startled by their presence and choose to stand their ground.

I am not the kind of overly inspiring black story that gets told in political stump speeches, however. I grew up blissfully, and somewhat ignorantly, middle class with a great nuclear family support system, one that some branches of my extended family ridiculed and other branches did not have and could not identify with.

I’ve seen people I hold dear, in the pre-ACA days, get very sick after working at a company for decades and seem to be pushed out, their loyalty seemingly unreturned in the face of struggle, in the thoughts of the company’s bottom line. That is one of the stories which shape my attitudes, the attitudes of a young minority woman, a woman in the tech sector where my humanities and liberal arts training serves me well, in a job which I took after leaving a previous one in publishing, a job which affords me proximity to loved ones, to graduate schools, to better salaries to place in my salary history. I was told I needed to work twice as hard to get half as far and be all things to all people, so I did. I did not win awards for participating, but rather was told how I should look in order to be a proper representative of the school on my high school dance team when I did not look like all the other girls. I was not sheltered by helicopter parents who did not hold me accountable, but rather driven by parents who deferred their own comforts for my well-being and achievement, so much so that paying them back by graduating with honors from high school and college seemed to be payment in kind. And they don’t have no awards for that.

I was raised understanding that black women typically earn less than their counterparts no matter the degree they attain, that black women still face higher unemployment despite making gains in education, that black women’s health is constantly more at risk, and that black women tend to have the lowest rates of marriage, even though those claims are continually under debate.

So can I really be blamed if I am less likely to attend a church every Sunday, less likely to stay at one company my whole life, or less likely to marry early? And more importantly, is it fair for media report upon media report – created by an industry that continues to say it wants more diversity and more youth in its ranks while most outlets never really act on it, although some try (an industry that I love and majored in, by the way, so I am particularly sympathetic to its struggles) – to continue to create a picture of millennials that is so one-dimensional that even though I adhere to some of its stereotypes, I can’t even see myself in it?

It’s a question to which I don’t have the answer, or even the confidence that it’s the perfect question in the first place. But I’ll keep trying to learn, and keep hoping that one day I’ll look out and see myself – not merely for my own sake, but for those I’ve met and also have yet to meet, so that when they see me, they don’t see stereotypes or misconceptions, but just me.

Good ol’ fashioned book learnin’

I have been resolving to read more this year – an unofficial New Year’s resolution, but really more a resolution to include more things in my life that enrich it, including but not limited to eating better, exercising regularly, learning how to play more than three chords on the guitar and more than just super basic HTML/CSS code, and writing for pleasure (including on this blog).

In college, professor after professor would say that it’s necessary to read for pleasure, outside of assigned readings, to become a better writer. I would scoff at this, not because I didn’t want to, but because I had no idea how to fit it in with a full course load along with three part-time jobs and internships, depending on the semester. I envied the people who seemed to have time to fit in reading giant books along with class, rationalizing that they weren’t working at the newspaper, and grading essays as a TA, and working as a desk attendant, and trying to finish an Honors thesis so as to graduate in four years.

So college came and went and I read for pleasure in limited amounts, sprinkling in a Buzzfeed or The Atlantic longread on the way to work, during a lunch break, or as some reading material before bed. I started and failed to finish many a book, stopping and starting, slogging through a couple of trashy and more respectable books cover to cover. But I couldn’t ever muster the time to take in books, claiming I was too tired after a day of work.

Then I saw that in February my friend Kristina Bui, a copy editor at the Los Angeles Times and an all-around awesome lady with fantastic book taste, seemed to consume books with the rapidity that I, say, drink water or breathe – so I resolved to finally get off my butt and read more. (Seriously, her Instagram has seen more book covers than I had seen in the first month of this year.)

This change, for lack of a better word or maybe for satisfaction with the simplest word, changed everything.

I usually tend to fall back on classics when I am trying to get myself out of a reading rut, and although Walt Whitman’s Song of Myself was a collection of lovely words which has altered the way I view things for the better, I decided to creep people’s Facebook pages and tear through friends’ recommendations of books written closer to when I was born than, say, Oscar Wilde’s were.

Ender’s Game, The Fault In Our Stars, The Hitchhiker’s Guide to the Galaxy and Me Talk Pretty One Day have all taught me new things. And the parts of Americanah and The Brief Wondrous Life of Oscar Wao that I have read prove to be just as shifting and altering for the way I view the world. (If this sporadic reading list isn’t indicative of my constant need to mix things up, I don’t know what is. And I am sure the next books sitting in my mental queue – the highly recommended Stay Up With Me and On Such A Full Sea – will add more names to a patchwork of enriching books that have made inroads into my brain.)

As I was walking the halls of Barnes and Noble on my way to purchase the Chimamanda Ngozi Adichie and Junot Diaz books above, I wondered if only people who spent lots of money on a major dealing mostly with words felt the way I do. Then I heard a construction worker I know talk in earnest about how disappointed he was that I had yet to read Ender’s Shadow, a companion book in Orson Scott Card’s famed Ender’s Game series.

So I guess it’s not just me.

In fact, researchers found that “critical, literary reading and leisure reading provide different kinds of neurological workouts, both of which constitute ‘truly valuable exercise of people’s brains.’” But this doesn’t just happen while you are reading. Other research has found that days after reading a book, not only can the story stay with you, but so can neurological changes in the way information is processed. “At a minimum, we can say that reading stories – especially those with strong narrative arcs – reconfigures brain networks for at least a few days,” a researcher noted. “It shows how stories can stay with us. This may have profound implications for children and the role of reading in shaping their brains.”

The most encouraging thing of all for me, though, is that reading hasn’t discouraged me from producing my own words, but rather made me more ravenous to write in my off time, even if those words reside in a Google Doc, waiting for ruthless edits or to be dumped and replaced with new, better ones.

Research again rationalizes my wild whims, proving that the voracious need to write day to day isn’t just a word-nerd thing either. In fact, non-writers reflecting on their writing prove to me that when I tell people anyone can (and should) write, I am not full of it.

“So I write. I write because it’s hard to remember everything. I write because it’s become a relaxing habit. I write because it’s private. Yeah, all my writing today starts as a private note. Too many people are afraid to write because of the time commitment or the resulting discussion. It’s an increasingly large problem due to the growth of the Internet and privacy. We no longer really ever find ourselves alone. And it’s because of this I choose to write privately first – with the option to share if it’s what I would deem a shareable thought.”

I guess the moral of this story is: don’t be surprised if book recommendations and stories of failing to become the world’s best guitarist or web designer are still to come.

Can You Be Nice, and On Top?

For a long time I have wrestled with whether or not to be nice. I know your mom told you it’s always good to be nice. But especially as a woman in the workforce, there’s this unspoken rule that you have to prove you can hang with the guys in order to be the boss. Be tough, but don’t lose your cool or get angry, otherwise the pendulum swings too far in the opposite direction.

In fact, I had someone on a hiring panel tell me – after I lost a job to a guy less qualified and (of course) less gregarious than me (gregarious was their word, not mine, although I do like it) – that there’s something that might help: being meaner. More colorful language was used to describe this, but I’ll spare you all of that.

I don’t fault the guy for telling me that because in general he is right.

Women in traditionally male occupations can either be viewed as competent (a significant hurdle…as evidenced by the MIT study) or liked (which, it turns out, is really important and for more reasons than just a desire to be popular)…but rarely both.

I mean, unfortunately, really right.

Studies have long challenged the idea that nice guys finish first. Being kind and considerate in the workplace has been perceived as a weakness, and an invitation to disrespect, and indeed studies have found that such behavior does not seem to come with many rewards.

… Facebook COO Sheryl Sandberg writes, in Lean In, about the numerous instances in which being overly accommodating — not taking the best seat at a meeting, waving off praise, underestimating their billable hours to avoid overcharging — holds women, in particular, back at work.

So I retraced it. Was it the cardigan and pearls I wore? Did I giggle too much? Do I smile too much when I should be “meaner”?

As I came to realize that if I had to change or question who I was as a person in order to get a job, then maybe the job wasn’t for me, I wrestled with how I should present myself in the future. Is my being jovial holding me back in my career? Do people take me less seriously because of it?

And then here comes Jimmy Fallon.

Jimmy Fallon proves it’s possible. Unlike the Philly magazine column about him – which ironically offers not a single new thought or complaint about Jimmy Fallon’s comedy while accusing him of that very same atrocity – Fallon is nice.

Saying that he shouldn’t do impressions mocks both the heritage of Fallon’s rise to stardom on Saturday Night Live and the fact that he does more than impressions: he gets the celebrities in on the joke. He gets Bruce Springsteen to mock himself, and gets Michael McDonald to sing “Row Your Boat,” and gets Barry Gibb to sing as he does a crazily outrageous version of his ’70s persona. He even gets Jerry Seinfeld to mock his own voice to match his impression. Celebrities and public figures are best when humanized and imperfect, as seen by the absolute love of Jennifer Lawrence being irreverent and speaking from her heart at every turn. Even Netflix banked on a Mitt Romney documentary that did the same thing – humanize him.

Sure, Jimmy might over-laugh at a guest’s jokes. Sure, he’s not the best interviewer. (But considering most late night talk show interviews are a rehash of pre-approved talking bits about a star’s latest vacation or a singer’s new dog, I think I’ll take Natasha Richardson playing charades or Drake playing flip cup over recycled, boring, and overly sanitized anecdotes anyway.)

He’s in the zeitgeist, and that’s why he gets to do the Tonight Show. That, and of course, he is nice. Unlike Mr. Philly mag, a lot of people have noticed this about Jimmy Fallon’s Tonight Show.

I know when I first started watching Late Night I thought, “Ugh, the guy who could never keep it together on SNL? WHY?” And then weeks went by, and then months, and he found his groove as the head of a three-ring-circus variety show put into high gear, and I had to eat crow. The show is watchable, shareable (which is ever important), and just plain ol’ nice and fun. And as he has proven in interview after interview, no one is more excited to be doing that job than he is.

There’s a reason that Upworthy headlines are now all the rage, and widely imitated with varying levels of effectiveness as a result. For so long as a society – especially in media and entertainment – we’ve lived by “if it bleeds, it leads.” But people grow weary of negativity, especially in an environment where our elected officials hit scandal after scandal, our neighbors and friends struggle to get by in a recovery just standing up on its Bambi legs, and our world becomes more interwoven yet disconnected every day. Now, media realizes that something heartwarming and fun – even if it isn’t perfect – isn’t just wanted, but is needed.

Jimmy Fallon is an inspiration, in that if he can get to the top of the heap, and do it while being imperfect, giggly, and constantly learning to be better, then isn’t there hope for all of us that nice guys and girls don’t always finish last? Sometimes, they come out way ahead – and all while being nice.

#Kindness

Warning: This post will be more optimistic than usual. You have been warned.

A hashtag can be many things on the Internet: a witty aside (#adultolescence … a personal favorite of mine [seriously, it’s in my Twitter bio, not even sorry about it]), a meta statement (#hashtag), a marker of participation in a major event (#SuperBowl, #HouseofCards) — or, as it has lately become, a call to action in support of an idea or a person, no matter the level of controversy surrounding them (#solidarityisforwhitewomen, #notyourasiansidekick … or the latest #freeincognito or #dangerousblackkids).
Much like journalism’s original purpose, the Internet at its best and its worst gives voice to the voiceless. This extends a lot of the principles of American free speech to the masses, and allows the voices of the public to say, spit, and spew more or less whatever they wish.
(Here is my free speech side note: Like any other right, free speech has limits. This is something people tend to forget or ignore when the speech serves them and get frustrated by when that speech violates the harm principle, or delves into the lurid or hateful.)
Free speech at its best serves a purpose, but many condemn hashtag activism – as well as things like liking Instagram pictures, passing along Upworthy videos, or sharing good-natured Facebook photos – as meaningless attempts at caring for others, another symptom of the narcissism bred and encouraged by social media users. The thought: I’ll share this so my friends think I’m a good person, and I’ll feel better, without doing much of anything to change much of anything.
This is where we get the Konys of the world and so many other viral stabs at progress.

[This apathy is known as pluralistic ignorance:] “Views, comments and ‘likes’ often feel like a powerful online currency to the recipient but they are cost neutral in the sense that virtual disapproval doesn’t commit the individual to real intervention.” This is a state of collective belief referred to as pluralistic ignorance in social psychology, and it doesn’t get any better while everybody stands on the sidelines watching as the ignorance goes uncontrollably viral. In doing so we aggravate the problem. We personally contribute to the bystander apathy with every supporting “like.”

But the origins of hashtag activism are, I think, most potent in the Arab Spring, which I followed closely in my International Journalism course a handful of years ago. (Mort Rosenblum, by the way, is one of my favorite journalism people ever. Buy Little Bunch of Madmen, read it, and love it. I cherish my copy to this day.) In that movement, people without a voice used the Internet and made it its best, organizing themselves, making powerful strides in citizen journalism and crowdsourcing sources, and bringing the organizing power of social networks for good to the forefront.
So are Twitter, Facebook, and a host of other social networks really the death of society? Is it true that “it’s a place where intellectual laziness is encouraged, oversimplification is mandatory, posturing is de rigueur, and bullying is rewarded … a place hateful people are drawn towards to gleefully spread their hate, mostly without repercussion”? Should we stop looking at the web and start looking at the “real” world?
I don’t think they are mutually exclusive. Technology has kept me in touch with family and friends across the city, the state, and the country. It has broadened my education and made pursuing it further a distinct and palpable possibility. It has kept me connected to others whom I love and couldn’t see every day as our relationship bloomed.
But it has also allowed people to blindly and namelessly assert threats and threaten violence against countless people, including myself. I could list the things I’ve been called online, but I’ll spare your eyes and not spread the sadistic, narcissistic ramblings of online-comment trolls, as studies have now characterized them.
So is there any merit to using the social networks which bind us to bind us in activism, kindness, and real change? Or to put it better:

There is significant power in the online world and the effort to hold people accountable for their actions. In some senses, the Internet has become a “voice of the voiceless” for people whose stories have historically been marginalized (or straight-out ignored). Facebook and Twitter serve as an immediate way to gain support, but seeing the social conversation over this past year leaves me with the question: How successful is online activism in relation to making changes in the “real” world, and what are the next steps?

There have been some steps toward translating the share or the like or the hashtag into something more palpable. One company made strides to make those pictures you post from happy hour sushi worth more than a few likes on Instagram, taking “foodstagrams” and translating that into feeding the hungry.
There was recently a Random Acts of Kindness week, where a hashtag bound people to do something nice for someone else – beyond the texted donation to the American Red Cross that only surfaces when a surprise disaster like Hurricane Sandy strikes. This week called for small random acts of kindness, where you can smile extra at a person on the street or park farther away to give a pregnant or elderly woman a shorter walk into the grocery store. There’s even an app, called Kindr, linked to this movement.
The spirit of HelloGiggles tries to accomplish this, fostering a place on the Internet with positive messages via a mandate coming from Zooey Deschanel at the top and spread down through its contributors, but that can’t be the only place where that kind of spirit exists. It has to be more than a call for nice favors. We might not be able to make the entire Internet a nicer place to be, but we can use the power and connectivity that the Internet provides to make kindness more of a priority, where respect and change can be more prevalent than trolls and negativity.

The value of not getting a STEM degree, from someone working in STEM

Coding is a skill. If you want to become a developer, then sure, go to school for information technology or computer science and become a database administrator at a software company. However, I work in the IT department of a tech company and did none of that. 
That means I look at problems differently than someone with a tech degree, and it also means that I get to funnel into jobs in tech that require critical thinking, problem solving and communication skills: technical writing (my current day job) as well as business analysis and information development and engineering. 
Plus, I still get to freelance in journalism and craft the prose I like in my spare time. 
It’s imperative to development to learn not only how to think but, as David Foster Wallace’s influential 2005 Kenyon College speech implores, what and how to think – in order to think outside of yourself and your space, and to enter into decision making and pondering on subjects that have vast use outside of your own knowledge base. Recognizing “this is water” takes skills that might not lie in a STEM education.
As I mention in a blog post on learning how art and science intersect in my own life, I have grown a new appreciation for my study of social sciences and the humanities in college. 
And here’s the dirty little secret: even though it seems like my concentrations led mostly to waxing poetic and eating empanadas (which, don’t get me wrong, is amazing), I also spent time learning how the economics of colonization shape our financial dealings with Latin America, and how the mechanics of the human machine and our evolution of bipedalism and quantal speech helped us take advantage of our cognition and create the culture we herald today as essential to the human experience. I learned how to communicate with people outside of my normal sphere in a way that assures I take an international view, an essential skill in a global economy. And the very best part: I learned how to communicate clearly — an undervalued yet extremely coveted skill, one that millennials all too often get accused of not having — as well as how to discern good information from bad information and to learn the ethics of the truth and the messaging I put out into the world — tenets of journalism which sometimes aren’t practiced, but are so crucial that Ezra Klein left a job at the Washington Post and dragged Matt Yglesias from Slate to help create a Vox vertical dedicated to context and pure information, to fill a space that is in dire need of filling in the age of over-information.
This does not mean that I don’t believe in the value of coding. In fact, I see some real benefit to having computer science (along with basic media studies and economics courses) added to a base education for students today, to create a more well informed public that knows exactly what it is voting for and why (or at least knows as much about the devices it uses to connect with the world as it does about the three branches of government or the basics of a sentence).
But saying that humanities lead to dead ends seems false in so many ways that it hurts me to think that people might take false reports of a lack of humanities majors as a sign to shy away from those studies. 
Especially since I’ve learned the basics of HTML and CSS web coding and design as easily as I have sharpened up my Spanish skills while delving into learning Portuguese and Italian, all using free online tools that make me somewhat rueful about the small student loan I’m paying down right now. (Seriously, I probably could have makeshifted a degree from MOOCs and online services. I probably still could. I mean, the MBA I’ve pondered and many have gone back to school to achieve would be easily obtained online — too bad that wouldn’t translate to a resume-padding, employer-recognized degree.)
My wandering point here is this: there’s more to life than becoming a STEM drone worker bee who took a “profitable” degree rather than studying a subject that will expand your mind and your worldview. This point comes with a corollary: liberal arts, humanities and social science departments should both play up the aspects of these majors which aid students far beyond a simple skill or two and beef up their interdisciplinary and dual degree programs, to make it easier for students to pursue two different degrees and get both the job skills they need to succeed in the workforce and the critical thinking skills they need to become more enlightened and participatory world citizens.
Especially considering you can get a coding job without a college degree, but you can’t recapture the time and freedom of your early 20s – time best spent in Italy for a summer, taking photos and drinking $1 wine while studying art history, or stumbling into your interest in the sociology of the body, or learning how the biological evolution of humans influenced the culture they later created.

What George Clooney and Steve Jobs taught me about how life, art, and innovation intersect

On a day just like any other, where I scrolled through links on the Internet, consuming a mind-numbing amount of media, I read a George Clooney quote.

“And with the end of a country’s culture goes its identity. It’s a terrible loss, down to your bones.”

And it got me thinking about the importance of culture in my own life.

My mother was a heavy proponent of art. She used to come to my elementary and middle schools, where my friends would call her Mrs. Woodberry as she brought in Otter Pops and taught people about classic European and African art. There’s a closet in my parents’ home that is still stuffed with “unfinished” paintings that my mother has done year after year, the ones that she sometimes gives away to family members and friends, the ones that I hope to hang in a home or cool beachside loft one day.

To this day, my place has art prints from different regions hung up to mimic the surroundings of a real adult as well as to reflect the way that I was raised. Art – be it writing, or music, or paintings, or dance or sculptures – was and is essential.

So when I decided to try my hand at a creative outlet when I grew up and started in journalism school, I thought: culture and art, that’s what I’ll report on. Whenever someone asked what my dream was I would say something along the lines of writing features at a midsized daily, doing personality profiles and writing about cultural intersection and about the arts.

And for some reason that was derided.

Not that people thought it was a bad job. Just that it was dessert. That I was good at what I was doing, and I was doing too well to “waste” it on culture. So after they’d say, “Write about what you are interested in,” and I’d tell them the real answer, they’d urge me to write about business or write about politics. That’s “real” news.

So I’d try and straddle the line, writing about border and immigration issues but end up writing features on civil rights crusaders and about plays that humanized the experience of immigrants. I’d write about education, but spend days and weeks learning about a little boy who swallowed a button battery and taught his parents how to love and appreciate life in the process, while still being the cutest boy in the history of boys. And I thought – well, this will work. It’s not always “real” news, but it makes me happy. And I did it with a sense of defiance. Who are they to tell me that I can’t have my cake and eat it too? Culture is important, and I’m on a mission to prove it.

And then I’d see things like this in the NYT:

“For many of the high achievers I spoke with, music functions as a ‘hidden language,’ as Mr. Wolfensohn calls it, one that enhances the ability to connect disparate or even contradictory ideas … Consider the qualities these high achievers say music has sharpened: collaboration, creativity, discipline and the capacity to reconcile conflicting ideas. All are qualities notably absent from public life. Music may not make you a genius, or rich, or even a better person. But it helps train you to think differently, to process different points of view — and most important, to take pleasure in listening.”

And things like this in the Atlantic:

“Creativity alone does not foster innovation, nor do abstract scientific or mathematical concepts. Innovators also need to know how to render those creative ideas into working products that can be put into use.

In order to bridge the chasm between abstract idea and utility, some educators are advocating for an expansion of the popular STEM acronym—Science, Technology, Engineering, and Math, the list of skills many experts believe more students need. They believe STEM should include the letter ‘A’ for ‘art and design.’ As Margaret Honey, CEO of the New York Hall of Science commented in an STEAM workshop at the Rhode Island School of Design, ’It’s not about adding on arts education. It’s about fundamentally changing education to incorporate the experimentation and exploration that is at the heart of effective education.’”

And I felt like my thoughts were validated. Ah-ha! Art means something. Music means something. Paintings, songs, poems, sculptures, orchestras, rock concerts, popular culture, subversive counterculture – it all means something. Economics and politics and business are how we create the societal structures we live in. But our culture is what we live for. And that’s the most important.

So why is it that people are so against treating culture as valid? I will be the first to admit that pop culture is not always defensible. If the Golden Globes never aired again, we’d all still go on living and the world would keep turning. (But then we wouldn’t get any cool Jennifer Lawrence gifs the next day, or get to see Lena Dunham in her 20-something, I’m-awesome glory, and that would be sad.)

“In the United States we are raised to appreciate the accomplishments of inventors and thinkers—creative people whose ideas have transformed our world. We celebrate the famously imaginative, the greatest artists and innovators from Van Gogh to Steve Jobs. Viewing the world creatively is supposed to be an asset, even a virtue. Online job boards burst with ads recruiting ‘idea people’ and ‘out of the box’ thinkers. We are taught that our own creativity will be celebrated as well, and that if we have good ideas, we will succeed.

It’s all a lie. … ‘We think of creative people in a heroic manner, and we celebrate them, but the thing we celebrate is the after-effect,’ says Barry Staw, a researcher at the University of California–Berkeley business school who specializes in creativity. ‘As much as we celebrate independence in Western cultures, there is an awful lot of pressure to conform.’”

We both celebrate things that are creative and artistic and deride them as less valid. If I had a nickel for the number of times I’ve read a headline that says “Run away from humanities and head to something pre-professional before you never get hired anywhere ever” or something of that ilk, I would be rich – or I’d have at least enough money to buy an iced tea and some Sun Chips from a vending machine, which is almost as good.

And as I saw myself moving from a job where I was writing about gluten-free cookies to one where I would write about web-based applications, I struggled to explain how I connected (or more like justified) the jump between the two.

And then, like he has done for so many others, Steve Jobs taught me a lesson, via the Harvard Business Review:

 “‘I read something that one of my heroes, Edwin Land of Polaroid, said about the importance of people who could stand at the intersection of humanities and sciences, and I decided that’s what I wanted to do.’ … [Jobs] connected the humanities to the sciences, creativity to technology, arts to engineering … The creativity that can occur when a feel for both the humanities and the sciences … is the essence of applied imagination, and it’s why both the humanities and the sciences are critical for any society that is to have a creative edge in the future.”

And I realized that there was a reason that paths cross and circles complete. And I realized that I don’t have to compromise to have culture and technology. And I realized that my mom was right, and I’m happy she did what she did for me when I was young. And I realized I was right, and also shortsighted (which technically means I was kind of wrong too, but who likes admitting that they are wrong?). Culture is important, but there’s a way to have your dinner and your dessert, to work in the “real” world and dance in the pop culture cotton candy yet totally soul-enriching world of creativity, culture and things of real consequence.

Plus the New York Times and Google agree with me … so there.

P.S. For this and so many other things, thanks Mom.

Running a brothel to pay for a cathedral, and who should sit in the pews

My good friend, blogger extraordinaire, and all-around kick-butt woman Laura Donovan and I were having an email exchange. It’s one in a long line of mini-novellas I will write to her and other long-distance friends, 1) to fly in the face of a culture that says millennials have no attention span and 2) because Gmail is better than Facebook Messenger for longer emails. I can hit enter without fear! But out of this conversation came shared, less-than-affectionate feelings toward the state of journalism today, especially when it comes to the state of the staff writer.

From the girl leaving journalism with a viral video — which her employers promptly copied without thought, sense, or new content to capitalize on her clicks and prove her point — to the clickbait stuff BuzzFeed posts every day (that I take the bait on all the time, so I am not a total hater on this media style), to the fact that many news startups are now favoring listicles over reported news articles or even smart cultural critique, the landscape of the journalism that I learned and sold and fell in love with in journalism school isn’t really there anymore.

It takes a Neetzan Zimmerman churning out clickworthy posts to allow the rest of the staff to write about stuff that matters – most of the time. The brothel pays for the cathedral.

It doesn’t mean that good journalism isn’t getting done. It means that it’s not getting done in the way I thought I would be doing it: sitting at a desk, full-timing it with benefits and a 401(k). It’s been called the death of 9-to-5ing journalism in favor of 5-to-9ing it: “A journalism qualification will probably not get you a job. It may help you make a living.”

And maybe it’s not a bad thing for the freelancers to fill the pews of the cathedral and have a hardworking staff running the brothel.

It’s always been a “rock and a hard place” feeling for me, where my biggest breaks in journalism and most exposure online came with tragedy, turmoil and the decline of good writing – whether it be penning stories for Reuters due to the fallout of a mass shooting in a city that I love, whether it be winning an Employee of the Month award for writing about someone trying to take a stand against civil rights violations in the wake of the wretched SB 1070 law, whether it be getting the most praise from education editors when I wrote about things that went wrong rather than right, or whether it be the continual butchering of smart cultural critique and media analysis — the things that I like writing the most and have the smartest things to say about — into drivel.

It’s hard for me to reconcile it with myself. I have gone to 5-to-9ing journalism, where I pay my bills with writing, both through purely journalistic means and through technical writing for a software company.

I’d be a fool not to take advantage, and I’d be a fool not to think that I would be a better tech writer after working with programmers, business analysts and tech gurus. Educators tend to write particularly compassionate and telling education stories, and lawyers do the same with law, and singers with music, and painters with art — you get people with experience writing passionately about fields that they love and taking more time to do it because, instead of being jacks of all trades relying on pumping out blog after blog to make rent, they are just freelancers earning some side money along with their regular pay.

It’s not hard to see why people who are making big waves today, like Amanda Hess with her expertly crafted piece for Pacific Standard, are freelance writers, crafting their own ways with topics that they are passionate about. I can’t blame people on the outside for wondering why anyone signs up to be a staff writer: “Do I really want to spend the rest of my life in a newsroom where the threat of constant buyouts or layoffs looms large? Where bitter cynical journos treat depression with Jack Daniels and wonder what happened to the good ol’ days? Where my job in theory is heralded and virtuous but really is being reduced to clicks by all means necessary?”

There’s nothing about journalism school that prepares you for that bitter pill, but for me, it’s one that, once swallowed, gave me a new perspective on work. “Do what you love and love what you do” in fact devalues “actual” work, but it also doesn’t allow for pockets of passion or shades of gray – does loving what you do for your freelance work mean that you failed at your dream since it’s not your day job?

Amy Spalding did an interview with Anna Swenson, a former colleague of mine at the Daily Wildcat, over at HelloGiggles on this issue, and her insight on pairing a day job and a passion speaks to this point so well:

“I’m not someone who is comfortable not knowing when my next paycheck is hitting or how much it’ll be. The pressure a steady job takes off gives me more mental space to be creative.

“So with that said, if my writing career gets to the point where it’s a choice I could make, those are a lot of good reasons I would still choose to keep the day job. I feel like there’s a tendency to think that if a writer doesn’t completely support themselves with writing that they haven’t ‘made it.’ There are so many ways to make it! So I try not to think of paring down to one job only as the goal. I do my best to live in the now with my careers.”

Just like it might be a good thing that law schools are churning out fewer lawyers, maybe it’s OK that colleges are churning out fewer journalists. You need watchdogs to bark about big problems, but there are way more lapdogs than watchdogs in journalism, because being a lapdog that pays rent and gets to send his kids to college is preferable to being a venerable, steadfast watchdog that keeps getting laid off from paper after paper because investigative budgets get slashed and viral content gets heralded.

“What factory that we’d once hear about dumping toxic chemicals are we not hearing about anymore?” asks Ted Drozdowski, a onetime Boston Phoenix editor. “There are less watchdogs, which is why we hear less barking.”

Because there were plenty of business journalists well employed at high-quality papers who missed the mark on the financial collapse when the signs were there early, and people still debate what the press’s role in covering (and not covering) climate change was and still is.

I really love journalism, but maybe the staff writer is dead – or maybe my dream of being a staff writer died when I learned that liking what I do for money and loving what I do for me is just as good as, if not better than, monetizing passion and losing love for the craft altogether.

Or maybe in three years I’ll realize I was full of it.