Welcome to the Modern Moon Life

Stories from a shift away from masculine, sun-based energy toward a feminine, moon-based life.

I Don't Want to Manipulate Emotions to Sell Things Anymore

I’m a digital marketer who doesn't want to use my skills to sell goods or services anymore. 

It’s not the goods and services that bother me, it’s the selling part. The shilling of wares. We have manipulated emotions and core feelings to market things for so long without stopping to think if we should. Meaning, yes, we can use these tactics, this technology, to sell things, but should we? 

Advertising Was the First to Manipulate Our Emotions

An article from a 1904 issue of The Atlantic argues that writing newspaper and magazine copy through the lens of psychology only strengthens the message: “In passing to the psychological aspect of our subject, advertising might properly be defined as the art of determining the will of possible customers. . . . Our acts are the resultants of our motives, and it is your function in commercial life to create the motives that will [affect] the sale of the producer's wares.”

And that was before TV and other digital channels allowed other senses to be captured, other aspects of our psychology to be harnessed so fully, without the awareness of the intended recipient. Over 100 years later, riding the exponential curve of technology, we don’t always consciously choose when, or which, ads are served to us.

The Way We Receive the Ads Also Uses Emotional Manipulation

It’s not only Facebook or Instagram or Google; the phones, the devices the ads are served on, are dangerous in themselves. The way notifications are programmed, down to the sounds and color choices, was designed through that same psychological lens. Their goal is to elicit more interaction. But, again, at what point do we ask: just because we have the technology, should we use it? For that morality, that question, is what separates us from the machine-learning gadgets: the ability to add an element of humanity, of curiosity.

Dystopian novels write about “Big Brother” listening, or machines learning faster and overtaking humans. The truth is, it’s all already here; it just doesn’t wear a human face. For if it did, if the machine were dressed up to look like a human while acting the way the iPhone, Alexa, and Facebook algorithms do, we would have something to “fight” against. We would (more easily) recognize the lack of humanity when presented with it in humanoid form. Instead, it is packaged in sleek white plastic/metal/glass (with black or rose gold options) and becomes an extension of ourselves.

Like another limb, our devices become a body part we can’t live without. And why would you fight yourself? Even if you know it’s harmful (and you may think: but it’s only harming me).

That’s how addictions work. 

But you are part of a whole. As living organisms, we are all part of a bigger picture, one that machines aren’t grounded in. (Adding a “yet” feels too ominous here.)

Can We Change?

I do believe collective humanity and consciousness is the difference, is the key. As a human writer, I have the choice at this point to leave you, the reader, with these scary maybe-truths (for it’s all a matter of perspective and personal belief). But I won’t, because, ultimately, I believe it’s not too late.

I believe in the resilience of the human spirit. (Ask any sleep-deprived mother of a newborn, or parents of children with special needs, or anyone helping a loved one who is like family fight an illness, and see how far they have gone beyond what they thought they could.) When we come together, we go beyond what we believe we can do, beyond where we believe we can go.

What does that look like here? The truth is, I don’t know. I don’t think any one person can answer that. For me, it would mean human-centered oversight of the morality behind how these devices are designed and coded, and better rules and transparency around how ad targeting works. But that oversight can’t come from anyone with commercial interests, which rules out governments and the companies themselves.

I would want a truly diverse group of humans to bring perspective. Right now, young(ish) white men are making all of the decisions and writing this code. That’s not okay. They aren’t even a majority of the world’s population now, and never were. I would like to adopt the motto: “Just because it’s new doesn’t mean it’s better.”

Apple recently became the world’s first company to reach a $2 trillion market valuation, only two years after being the first to hit $1 trillion. I’ve thought a lot about that, and about giving back in the form of charity, but I realize that Apple, along with Google, Amazon, Facebook, and a few others, has the biggest opportunity to change the world from within. To ask the question: is what we are creating going to help the world, or just our bottom line?

Who do we report to: our shareholders, or the children who are inheriting the world we are shaping with our devices? I know who I, as a cis woman, mother, partner, friend, sister, daughter, entrepreneur, and community member, would choose.

So, yes, I know how to manipulate the algorithms to help sell more things. I can write copy and create design strategies that will catch the eye of a target demographic, that will nudge someone to fill a need (they may not know they have) with a product that will “help” them feel better. I can mine data to refine those strategies and get better results. I can craft a story that helps them connect and justify their purchase. But I don’t want to do that anymore.

So what’s next? 


Found and Lost - Trust in the Unknown

I love the moon | how it empties its shine | into morning