Posted by admin On December 15, 2017
“Artificial Intelligence” has become quite the buzzword these days, especially in an era of never-ending buzzwords. Every day there’s excitement, enthusiasm and mesmerizing anticipation, along with all the drama of the ages as mankind seemingly climbs the evolutionary ladder a dozen rungs at a time. Like everybody else, I have my own opinions, but what can differentiate mine from anybody else’s? To go along with all the hype comes a tremendous amount of absolutely useless noise. How can my own little voice rise above that noise and not become a part of it?
Although I believe that artificial intelligence is overestimated in many respects, I am nevertheless very excited about its potential, even after accounting for all the noise. A large part of the work in the field concentrates on one particular milestone: the point where artificial intelligence will outstrip human intelligence. There was a time when I had to go easy on even mentioning “artificial intelligence” so it didn’t sound too “sci-fi” and scare people off; now observers are disappointed if your solution is not completely magical. The last few decades have shown us what we cannot yet achieve (conscious programmatic intelligence), but also how we can create something less dramatic, yet at the same time very valuable: unconscious programmatic intelligence.
But, unfortunately, I think people spend WAY too much time thinking about enhancing a machine’s intelligence to the point where it “turns” on mankind. We should be thinking about the possibilities of enhancing our OWN intelligence instead. After all, AI is just a tool like any other. Apparently we fear our machines because we fear ourselves. It’s the “augmented” self we need to become aware of. We are “augmented” whenever we pull out Google Maps on our cell phone to find out where we are…or send a live video to our friends on WhatsApp and Instagram. We don’t have to try to “out-think” a computer. We just need it to help US think. THAT’S the point! We just need to get better at that and then watch what happens.
Any account on Facebook or anywhere else online is already a link between a human and some artificial intelligence.
Just as the brain recognizes patterns and helps us categorize and classify information, neural networks do the same for computers. Nothing makes you appreciate human intelligence like finding out how incredibly difficult it is to create a computer as wise as we are. Any attempt to interpret human behaviour primarily as a system of computational mechanisms, and the brain as a kind of computing apparatus, is doomed to failure. But if an intelligent machine were able to discern the intricate, hidden regularities in the data about what we have done in the past, it could extrapolate our later desires, even the ones we do not fully know ourselves.
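To make that idea concrete, here is a minimal sketch (all names and data are hypothetical, and real systems are vastly more sophisticated) of what “extrapolating from past regularities” can look like at its very simplest: counting which category of thing someone has gravitated to before, and guessing they’ll gravitate there again.

```python
from collections import Counter

def predict_next_interest(past_actions):
    """Guess a user's next interest from the most frequent category
    in their past behavior -- no 'consciousness' required, just
    counting regularities in the data."""
    if not past_actions:
        return None
    counts = Counter(category for category, _item in past_actions)
    # most_common(1) returns a list like [(category, count)]
    return counts.most_common(1)[0][0]

history = [("watches", "Expensivo"), ("shoes", "SneakCo"),
           ("watches", "Cheapo"), ("watches", "MidTier")]
print(predict_next_interest(history))  # -> watches
```

Trivial, of course, but it is the seed of the unconscious programmatic intelligence described above: no understanding, just pattern and prediction.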
Then I read about the huge commitment of the worldwide software industry to artificial intelligence and neuroscience. The collective intelligence of humanity, starting with the development of language and the creation of large, densely populated communities, moving forward with the inventions of writing and printing, and now enhanced by tools such as the Internet, is one of the main reasons we have managed to overtake all other species. Yet until the 1990s, despite computers making punishing chess moves, we were still not very close to artificial intelligence in general. While man can always act in new ways (a measure of how impressive our brain is, and perhaps a cocktail of very human intuition and spontaneity), computers get stuck when they come across situations where they have not been told how to act (or how to learn to act).
But the real “proof of the pudding” is this article itself. It is solely my article and original. However, part of it was written by me by hand, and part of it was written by “Stinky”, my AI ghostwriter. Can you tell which is which? It won’t be hard to tell the difference between “fact” and “feeling”. After all, that’s what good writing is all about. Just remember, if you judge this to be a good article, it’s not because I used AI to write it, it’s because AI made ME a better writer. I don’t mind a little “extra” intelligence. Do you?
Posted by admin On July 8, 2016
This is getting deep. As I become more and more immersed in the world of digital advertising, I’m realizing how complex this all really is…and, of course, still so simple. Advertising has always been about getting and holding people’s attention…an hour, a minute, a second…attention + time = value. And that “value” is how advertising agencies get paid. But in the amorphous world of attention, we can’t always assume we have it, unless, of course, we do. By that I mean, the 20th-century model of attention meant holding someone’s attention for a couple of hours, an hour, or a moment for that matter. Once the advertiser has your attention, they can provide the occasional “public message” or commercial, assuming you are still glued to the set; if spellbound by CSI (Miami, New York, LA…etc.), you would be just as “glued” to the soap sell. Of course, we all know that’s not usually the case; this is where most of us flip the channel, make a sandwich, mentally tune out. But there are always exceptions. If you can just catch the right someone at the right time with the right product then SHAZAM! you have a sale. Odds are against it, but if you have millions of eyeballs sucking in your message, you’re bound to get some bites. And even without the direct response, being the greedy little consumers that we are, we inevitably shop for things we know nothing about…from soap to snowblowers, and as we’re walking through the shopping aisle hoping to find brand Whatever, we find brands A to Z. So many to choose from, how do I choose? Why, I see Brand A on TV all the time, I’ll buy that one…even though Brand B is cheaper and probably superior. But that’s what those advertiser dollars are for.
Unfortunately, those dollars can’t be exactly correlated into actual dollars and cents. Advertising, like economics, is an inexact science. Well, it WAS an inexact science anyway. Now it’s becoming almost pure science. I use the television analogy for two reasons.
1) I am still, yes I hate to admit, a regular television watcher. No, I don’t watch network shows or America’s Got Talent. Mostly sports, old movies and nostalgia, and even then, though I shouldn’t admit it, I have to trust my DVR to save the day for me. In one nostalgic episode of one of my favorite comedian-narrated shows, I counted 10 commercials between the intro and the start of the actual program. TEN! There was a time when it took the entire half hour to show 10 commercials. No one over the age of 8 is going to sit through 10 commercials without flipping the channel, going to sleep or turning the tube off altogether. No one.
2) People like me, people who watch any kind of television, are rapidly going the way of the dodo bird. Not enough to kill the industry just yet, but enough to see the writing on the wall. And that writing will be page views, event conversions and line graphs. The Internet has brought an evolutionary leap into our lives. I can honestly remember a time when the “Internet”, better known then as the “World Wide Web”, was there simply to “inform”. Now we depend on it to run our daily lives, our governments, and, yes, our entertainment…in short, it’s where all those eyeballs are drifting off to…by the hundreds of millions…even billions.
But it’s the sheer scope of the Internet that provides both the risks and rewards. With hundreds of millions of eyeballs pouring through the Internet every day, sitting down behind a tube to watch today’s bad news seems superfluous to the average user of the Net, whether on desktop, pad or mobile. But the real magic of the Internet isn’t just all those eyeballs; it’s that now we know what all those eyeballs see when they come to any particular website, whether it’s to read international news, catch up with friends and family on Facebook or buy a new pair of shoes. It may seem a little creepy but actually it’s not; it just gives the new generation of “techies” that much more to play with…data. Data, data everywhere. We’ll get to that subject in another article to come.
Posted by admin On May 5, 2016
Just read a very interesting article in Advertising Age on the “Agency of the Future”. I couldn’t help but find it fascinating since I had given a TED talk at JWT last year with the exact same title and theme. However, my own speech was a lot more mundane compared to the highfalutin spiel an Advertising Age writer can spin. After all, it is the advertising business, so one comes to expect a reasonable amount of razzmatazz, but this article really laid it on thick. The basic gist of my own premise was that, like in any other business, technology is here to stay, and not only to stay, but eventually to rule. Usually that means fear and panic for all those “non-techies” out there. This article is no different.
Following are some direct quotes from the article:
“We’re looking for a higher degree of consolidation to make integration and interdependence more effective”
“There will be dedicated client teams and a greater degree of open-sourcing of talent and capability”
“The most effective creative will come from the integration of content creation and distribution, and greater in-house content publishing resources.”
Blah, blah, blah…
When advertising executives start using this many buzzwords and neo-clichés it can mean only one thing…run for the hills before the smoke clears, boys…the jig is up! In other words, to quote, they are “…predicting a 25% reduction in head count for holding companies in the next 5 to 10 years as a result of the ‘power of automation’ in content creation and distribution and the impact of artificial intelligence on administrative roles”. In plain English: tech is taking over, kids, get with it or get out. It’s ironic that advertising execs are starting to sound like rust-belt execs 20 years ago. That’s no coincidence. We are coming to the end of one era and the beginning of another. The smug, aloof Madmen of the past are slowly, but ever so surely, being replaced by smug, aloof “techies” secure in their insular knowledge of systems, servers and code. I’ve written about this before and I have the feeling I’ll be writing about it for some time to come. Almost 7 years ago I wrote about the natural phenomenon of “creative destruction”. In short, as a new industry arises (usually due to new technology), old ones fall by the wayside.
In that article, I tried to point out that in the 20th century the dissolution of an entire industry into others was a relatively slow, albeit inevitable, process that generally took a generation or two to complete. But in the 21st century those processes have been sped up considerably by the “digital age” and the expectations that come along with it. What all the above Ad Age double talk really comes down to is that the “old” way of life in advertising…big agencies, big budgets, big dinner checks…is coming to a close, and a whole generation, maybe two, had better start either taking as many General Assembly tech courses as they can or start working at Starbucks.
I’ve been a “techie” my whole career, in a variety of businesses, not just advertising, so I know first hand what a closed-in, self-important little group of antisocial “nerds” can really be. They can’t help themselves; in a knowledge economy it’s natural for an “I know more than you” psychology to thrive among the techies themselves. You can imagine what manifests itself when “hard-core” techies encounter “non-techies”, bean-counters and anyone over 30. This will eventually include C-level execs, boards of directors and the rest of the human race as a whole. Trust me, I’m not exaggerating. Soon it will be 20-something content producers producing content for 20-something content consumers. Not bad if you’re a 20-something developer who can explain how to code a binary tree or a hash table. But not so much for everyone else.
As in any culture, too much “inbreeding” is not a good thing. Eventually, when we have a culture dominated by 30-year-old men in T-shirts and sneakers buying and selling to each other, when every form of media takes the form of a video game or something equally juvenile, and programmatic becomes automatic, hopefully there will still be some spark of actual creativity, maturity and human emotion. Yes, that’s always been a lot to expect from advertising in general, but language has a way of turning into actions and cultures have a way of turning into reality. Cultures are only a group point of view, but from the inside out, they are as tangible as the warmth from the sun. However, cultures also have a tendency to be somewhat myopic and jingoistic. Western culture invented phrases like “Third World” and “Red Menace” which, when closely examined, are just perspectives that suit those who are comfortable with such divisions. Unfortunately, most “techie” cultures aren’t warm and friendly. They are usually competitive, cliquish and harsh, Google notwithstanding. But more and more, it is that culture which is beginning to prevail in corporate culture, social culture and, yes, Western culture as a whole. Finally, the most ominous quote from the article for me:
“People who understand data and omnichannel ultimately become the most responsible custodians of a company’s money and how to spend it.”
There was a time when understanding “people” was the key to commercial and, sometimes, even personal success. Yes, it truly is the end of an era.
Posted by admin On March 6, 2016
If I were a sci-fi writer or an Orwellian social critic I would call the 21st century the “Rise of the Techie” and, indeed, it is. But I don’t think the societal structures currently in place have any idea what that will mean to the “normal” day-to-day life of Jane and John Doe. From my point of view in advertising and now ad-tech, these trends are much easier to see coming than for an average “techie”. “Techies”, whatever that term ACTUALLY means, are seen as the most unaware people in the world, focusing on “apps”, “product cycles” and “agile configurations”, nomenclature common in today’s trendy world of millennials starting up a brand new “hi-tech” start-up every day. The cultural language of our time is full of terms like “social network”, “gaming” and “Netflix binging”. We hear it every day, but somehow it’s still something ethereal, mystical and beyond the realm of “normal” people.
“It’s a digital world and I am just a digital girl…!” –revised Cyndi Lauper
As an added consequence, anyone over 35 is immediately ejected from the “in-crowd”, since unless they’re the ones financing those brand new “hi-tech” startups…they just don’t “get it”. Which is ridiculous but accepted in the “Ender’s Game” culture we live in. But there are exceptions. In the recent movie “The Intern”, Robert De Niro does his usual credible job of portraying someone we all want to be (and, of course, aren’t), guiding the trendy, beautiful “founder” of an ecomm company through the perils of both corporate and “real” life. My personal feelings aside, the reason I mention this here is that, although she was the CEO of a “digital” company, she could do everything from customer service to showing warehouse workers how to fold clothes for shipping…but there was one thing she DIDN’T do…code! That was left to the nerdy oddballs straight out of The Big Bang Theory. Grown-up men who dress and act like boys, finally learning from a paternal De Niro the dignity and manly grace behind wearing a tie or carrying a handkerchief. Well, whatever…I may never understand why all these stereotypes are so wildly popular, but that aside, there is some reality behind all dreams.
In my previous post The Great “Talent” Show I expressed my dismay at how techies, developers in particular, are somehow always the forgotten man in technology’s Broadway show. Sort of like putting a lot of hype into Wolfgang Puck’s brand new 4-star restaurant without hiring anyone to cook the food. American culture seems to be one of downplaying what’s truly important for what’s shiny, bright and new. A Peter Pan culture that likes when nice things happen but doesn’t always want to be responsible for making them happen. It’s like having a century of fossil-fuel-based economic growth without anyone ever once asking, “What happens when the fuel runs out?”. We’re currently in the business of creating a global technology economy, an economy based on code, trillions of lines of code. But who’s going to write all this code?
What fossil fuels were in the 20th century, code will be in the 21st century…the fuel that runs the economies of every country in the world, 1st, 2nd or 3rd world, it will not make any difference. The demand increases every day; will the supply keep up? In the end, all those bits and bytes are put there by people. Perhaps some day there will be programs to write programs. Who knows? Time will tell. But the good news is, as if by divine decree, more and more tools to create all this digital nirvana are being created every day. And anybody who has access to the internet can use them.
In just the last 5 years alone, I personally have seen not just technology leaps like faster chips or cooler iPhones but “software tools” that make all that hardware more useful, more relevant and more valuable. Yes, all the Dockers, Node.js’s, GitHubs and the like are creating a modern renaissance in software development…a “Golden Age” we are just now embarking on. In the 20th century, it was C, C#, Java, Perl and PHP that were cryptic, mysterious and hidden, but provided the fuel that created the economic miracles of their day: Apple, Microsoft and the ever amazing Sony PlayStation. The difference? Not just semantics or even code. If you look deeply under the hood of the “new” stuff, we find ourselves still surrounded by C code and Bash shells, just like 1990! But in 1990, to even know what this stuff meant, much less be able to use it, was another matter. You could tinker in your garage with parts from Radio Shack, but unless you’re Jobs and Wozniak, that was “unprofessional”. Generally you had to have a clear and distinct interest and major in “Comp Sci” in college just to get behind the screen of a “mainframe” or “mini-computer”. Then you had to get that first job in “IT” as an analyst, technical support engineer or even, yes even then, a programmer. The typical interview question then: “You know C? When can you start?”
Now any 12-year-old can start his school science project on GitHub, pull in a Node module and create a mobile app to keep the scoring average of his favorite NBA players on a day-to-day basis. I don’t know whether anybody out there is doing or will be doing such a thing, but I do know that they CAN do it! Anywhere in the world at any time! And that is where all of our futures lie. In the hands of not just “developers”, “techies” and “propeller heads”, but little Johnny and Jane, who program as naturally as they drink milk and outgrow their clothes…openly, without stigma, ridicule and prejudice. In a world full of consumers SOMEBODY has to be the producers. And I’m betting it won’t just be the Stanford scholars and dropouts in Silicon Valley. I’m betting it will be the kids in São Paulo, Nairobi and Manila.
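For what it’s worth, that science project really is a few lines in any language. Here is a sketch of the “scoring average” part in Python (the post imagines a Node module, but the idea is language-agnostic, and the player and point totals below are made up):

```python
def scoring_average(games):
    """Points per game across a list of (opponent, points) records."""
    if not games:
        return 0.0
    return sum(points for _opponent, points in games) / len(games)

# Hypothetical game log for a favorite player
curry_games = [("Lakers", 34), ("Celtics", 28), ("Knicks", 40)]
print(round(scoring_average(curry_games), 1))  # -> 34.0
```

The rest of the imagined app is plumbing: fetch the day’s box scores, append to the log, rerun the average. All of it within reach of a motivated kid with an internet connection.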
Think about it…
Posted by admin On October 12, 2015
As I watch the digital landscape change around me, not just for advertising and/or “ad tech” but for the entire world of business itself, I still find it amazing how technology itself has become gentrified. As the new generation of devs replaces the old “IT” guys, it’s almost scary how quickly knowledge is discarded for youth, for something new, and for the myth that the ability to code is linked to some mystical genius gland that some have but most don’t. Coding is both a skill and a talent and develops itself as uniquely as the individual that bears it. Just 20 years ago corporations were the mainstay of international business and, as such, called for a certain conformity in order to belong to the “corporate” culture. Today that is all but dead except for some of the oldest, largest corporations.
All businesses are now being “de-centralized” by technology; as I said, not just advertising, ALL businesses. Which means, most likely, the 20th-century “corporation” as we know it will eventually disappear, re-invent itself, or become unrecognizable. In my mind, this is not necessarily a “bad” thing, but change on that scale usually means a lot of economic disruption goes with it. With economic disruption comes social and political disruption. Disruption can be revolutionary, so it’s my opinion we, as a society, a culture, are in the midst of a full-fledged revolution. To be honest this isn’t really something new; it’s been going on for quite a while, but it’s interesting to see how “millennials” spin it as something of their own creation. But in the end some things never change…like the “geeky” new GE developer who can’t get his friends or family to take him seriously because he doesn’t program a video game or lift a sledgehammer.
As you may have guessed, I’m no millennial, but I remember going through the same indifference my entire career. Over the last 20 years I’ve worked in every vertical from foreign exchange, publishing, politics, and TV to advertising. Advertising, I hope, being the last leg on that long journey…but it wasn’t until advertising that people’s eyes stopped glazing over when I told them what I did. Most still aren’t sure exactly what it is I do, but if it’s “advertising” it must be cool! Point being, tech has been around for 50-60 years, and as long as it’s been around, it has NEVER, EVER been cool. Until now. From the Bill Gates to the Zuckerberg era, something has dramatically changed. The rise of the millennial has led to the rise of the entrepreneur! Every “millennial” I meet has it all figured out: the great idea, the right investor, build the team and voilà! instant millions! It’s the IDEA that is the diamond in the rough…new ways to tell your friends what you’re doing when you’re brushing your teeth or eating a pizza! The actual technology of how to do this is usually almost an afterthought. The people who are going to build these ground-breaking, multi-million dollar blockbuster businesses are at the very bottom of the list. This I have never been able to understand.
A good developer, I mean a REALLY good developer, is worth twice his weight in gold. Without the tech nerds, white or black, male or female, old or young, to churn out that living code, there would be no app, no business, no millions. But for some odd reason, they’re still the last item on the list, as if, when the time comes for the code monkeys to do their thing, someone hops in a pickup truck, drives to a corner at 6 in the morning and says “you, you and you” to a group of ne’er-do-well men anxious to earn enough to eat for the day. I just don’t get it. Yes, I know when you read ads for developers, programmers and architects you see 6-figure salaries and alluring perks. There was a time in Silicon Valley when top-of-the-line devs got signing bonuses the same as professional athletes! But that’s definitely a “west coast” thing. Here on the “east coast”, and in particular NYC, the demand for good devs is through the roof, but the process of finding them is itself bordering on the absurd.
The problem to me is a simple one. There is quite a disparity between the people that need the talent and the talent itself. In the old days, if you were a bricklayer, for instance, you would work your way up to becoming a contractor and then YOU would hire bricklayers. In today’s era, some techies give up coding for the “big office”, but only after they pass their trial by fire on the front lines, and then not often enough. Usually the “idea” guys are young millennials whose only desire is to get rich by the time they’re 30, retire at 40, then become VCs themselves. As for developers, well, they simply cease to exist past the age of 35. They either die on the spot or dissolve into some ethereal state, only to be reincarnated into the next generation. When I was a kid I wanted to be a fireman. I never realized there are no 50-year-old firemen except the ones that ride in cars…not the cool red trucks.
In any case, the technology world is rapidly becoming filled with more fallacy than functionality. In the “real” world, New York investors, visionaries, millionaires of the future, see developers/programmers/techies the way they see any other job on the job market, with a fixed idea of what they do and how they do it. Especially if they aren’t the ones doing it! If you’re a bank teller, you interface with customers, complete transactions, and count money. Simple. You interview a bank teller on how many years he’s interfaced with customers, completed transactions and counted money. If he or she answers the questions correctly and has a nice smile, they’re hired…simple enough.
The current fad is for “sweatshop” development, where you fill a room with “programmers” and just go at it for 50-60 hours a week, barely keeping up with the market. The market being whatever vertical you happen to be in: games, ring tones, tech ads, ecomm, it doesn’t matter. Throw bodies behind a computer, assess who’s doing the most work over a period of time, keep the high performers (before they jump ship) and let the rest go. That seems to work for a while. But it cannot last. Programming is a talent, and like all true talent it is made more valuable by its rarity, not its commonality. A good developer is NOT a commodity and should not be treated as one. They’re not just part of the business, they ARE the business!
Posted by the kid On May 6, 2015
Matchmaker, Matchmaker, make me a match! There is a new wave of technology waiting to explode onto the world. We’re already familiar with it to some degree, especially the hard-core “geeks” who write all these programs that are rapidly taking over the world. Anyone that calls themselves a programmer cannot function without a good string “matching” function in their favorite text editor. In this case, “strings” of letters and characters that can be found among thousands of lines of otherwise indecipherable code. But you don’t have to be a programmer to exploit this particular technology. Every time you type into that simple Google search box the process begins. Behind that little white box is a “black box” of code, algorithms and mathematics to match your quest with information, images, videos, whatever, from all over the globe, 24 hours a day. But the most familiar applications of matchmaking are the good old dating sites…eHarmony, Match.com, OkCupid, OurTime…all wildly successful for bringing hopeful A together with lonely B. Before these sites, if people couldn’t “match” up face to face (God forbid), then they had to rely on hours and hours of Craigslist “personals”. Even further back, during the technical Dark Ages before the Internet, this laborious task would have to be done scouring through local newspapers, if not local bars! The hit or miss of an SBM searching for a WJW for some frisky RNR was a risky proposition at best! Now, however, the “Secret Sauce” these dating sites provide all but takes the risk out of it, provided of course your match isn’t using a 20-year-old photograph taken during “better” days. Well, you can’t remove all the risks!
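For the non-programmers: that editor “matching” function is conceptually tiny. A rough Python sketch, searching a made-up snippet of code for every line that mentions a given name:

```python
import re

def grep(pattern, text):
    """Return (line_number, line) for every line matching a regex --
    the bread-and-butter 'matching' every text editor provides."""
    regex = re.compile(pattern)
    return [(n, line) for n, line in enumerate(text.splitlines(), 1)
            if regex.search(line)]

# A hypothetical three-line source file to search through
code = "def login(user):\n    token = make_token(user)\n    return token"
for n, line in grep(r"token", code):
    print(n, line)  # lines 2 and 3 mention 'token'
```

Everything from an editor’s find box to a search engine is, at bottom, a far more elaborate version of this: a pattern, a haystack, and a ranked list of hits.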
The point is, the boom in matching technology is moving beyond bringing A and B together for love, romance and “other things”. The “Secret Sauce” is actually based on deterministic mathematical proofs that can be programmed and predicted. Ironically, one such algorithm is known as the “Stable Marriage” theorem (https://www.youtube.com/watch?v=5RSMLgy06Ew). This can explain some of the success of the dating sites. In other words, the “Secret Sauce” actually works! But these same algorithms need not be restricted to dating sites. Now we can match you with your new home or apartment (apartment.com) or even match aspiring doctors with the hospitals where they will do their internships and residencies (nrmp.org). And even more infamously, Wall Street, where “programmatic trading” has been the cause of many a fortune won and many a fortune lost. But the newest and most rapidly changing frontier is, yes, you guessed it…advertising! In my previous article on “programmatic advertising” (http://sonyainc.net/wordpress/?p=379) I talked about the rising trend of letting computers do all the hard work of advertising on computers. Makes sense, but computers are computers. They can only display ads, not react to them. That is still the realm of humans, and much to the relief of the many humans who have built careers in advertising, computers aren’t ready to take over just yet. There still has to be a “human” factor added to the infamous automated RTB (Real Time Bidding) that’s taking the industry by storm. That factor is the appeal of content being matched with the pocketbooks of the advertiser. Some of this is automated, of course, that’s the point of programmatic advertising, but humans still have to make SOME choices. What networks to use, what kind of audience is being sought, how much money should be spent. But so far, programmatic advertising seems more concerned with advertisers and publishers than with us, the consumer.
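The “Stable Marriage” result comes with a constructive procedure, the Gale–Shapley deferred-acceptance algorithm, which is also the basis of the NRMP residency match. A compact sketch, using a hypothetical two-on-two example (real deployments add capacities, incomplete lists, and tie-breaking):

```python
def gale_shapley(proposer_prefs, reviewer_prefs):
    """Gale-Shapley deferred acceptance: proposers propose in order of
    preference; reviewers hold the best offer seen so far.
    Returns a stable matching {proposer: reviewer}."""
    # rank[r][p] = how much reviewer r likes proposer p (lower is better)
    rank = {r: {p: i for i, p in enumerate(prefs)}
            for r, prefs in reviewer_prefs.items()}
    free = list(proposer_prefs)            # proposers with no partner yet
    next_choice = {p: 0 for p in proposer_prefs}
    engaged = {}                           # reviewer -> proposer
    while free:
        p = free.pop()
        r = proposer_prefs[p][next_choice[p]]
        next_choice[p] += 1                # next proposal goes further down
        if r not in engaged:
            engaged[r] = p                 # r accepts the first offer
        elif rank[r][p] < rank[r][engaged[r]]:
            free.append(engaged[r])        # jilted proposer re-enters pool
            engaged[r] = p
        else:
            free.append(p)                 # rejected; will try next choice
    return {p: r for r, p in engaged.items()}

men = {"A": ["X", "Y"], "B": ["Y", "X"]}
women = {"X": ["A", "B"], "Y": ["B", "A"]}
print(gale_shapley(men, women))  # A pairs with X, B with Y
```

The theorem guarantees the result is stable: no man and woman would both prefer each other to the partners they ended up with, which is exactly the property a dating site’s “Secret Sauce” is selling.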
Let’s step back a little bit. It is no coincidence that a few of the same names that started out on Wall Street developing algorithms for programmatic trading have found themselves pioneering programmatic advertising, but with one major difference. Financial markets are seen as “zero-sum”, i.e., if there is $10 on the table and that $10 goes to me, it does not go to someone else. In other words, if I win, someone else loses. There’s been a lot of angst and worry inside the industry, especially among “traditional” advertising agencies, that “programmatic” also means zero-sum. Actually, nothing could be further from the truth. I see programmatic advertising as a powerfully synergetic force…almost frighteningly so. Put it this way: I am a publisher, I have content and an audience…advertisers want to reach my audience through my content, and I want those advertising dollars to fuel my content and build my audience. A win-win situation, you say? Perhaps. But what about the poor consumer, already inundated with thousands of ads, solicitations and annoying offers! Argh!
I’m Joe Consumer, what does “programmatic” mean to me? I do a quick search one day on an expensive watch just to see how expensive expensive can be. I have absolutely no intention of actually buying the thing (unless my startup is about to go IPO, of course). But for the next few days, whenever I’m on Twitter or Facebook or another digital “ad” network flavored by someone’s “secret sauce”, I’m inundated by ads for Expensivo watches. How annoying and creepy! Until I realize I want to buy a writing stylus for my trusty iPad, which, by the way, is not sold by Apple. Darn…where do you buy these things? Hmm…let me do a quick search for “iPad writing stylus”…All my options come up as usual, and now I can take my pick…or not. What’s the rush…over the next few days, provided the “secret sauce” is working, I only need to do what I usually do on Facebook, Twitter and what have you. Stylus ads start popping up all over the place.
In other words, ads aren’t so bad if they’re ads I actually want to see. Mind you, I still wouldn’t mind an “off” button, but even I have to admit I find ads for products I’m genuinely interested in much less annoying than the thousands I’m not. I’m also still no fan of boundless commercialism but as long as we live in a consumer society I don’t mind having my own choices as to what I consume and just as importantly…where I can go to consume them.
Posted by admin On March 5, 2015
The relationship between advertising and technology has always been a tenuous one. The two have gone hand in hand since the invention of the printing press, but it’s always been an odd partnership, both symbiotic and dysfunctional at the same time. And now, well into the 21st century, it has become only more so. Advertising relies on “communication” to identify, influence and even coerce the human psyche to buy, or not buy, material and spiritual goods, relying on eye-catching visuals, comforting stories and catchy ideas. Technology provides the “media” to bring those visuals, stories and ideas to the sometimes willing, sometimes reluctant masses. In advertising you can never have enough eyeballs. However, technology appears to advance faster than marketing ideas…sometimes too fast.
As we’ve progressed from graffiti to newspapers to radio to television to the internet, the idea has always been the same…the more eyeballs the better. But right up until the internet, the medium was only the conveyor of the message, a message that usually stood separate in and of itself. Before the internet, media was passive. The masses read, listened, or watched, and what they did next was unpredictable and random. The goal of the advertiser was only to deliver the message; after that it was up to the advertising gods and human nature to determine the actions of the individuals who consumed it. But with the rise of the internet came a breakthrough in advertising science…the Banner Ad!
It’s been 20 years since the invention of the banner ad, proudly displayed by the then-innovative HotWired. The idea was that this new, more personal medium could not only reach the masses but actually capture their responses, reactions or revulsion; the Rubicon had been crossed. Advertising has been in the “digital” age ever since. Don’t scoff: the original banner ad had a whopping 44% click-through rate, meaning that a little less than half the people who saw it on their computer clicked on it, usually just out of sheer curiosity. Twenty years later, a click-through rate of 1.5% is considered successful. Why?
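For what it’s worth, the arithmetic behind a click-through rate is nothing exotic: it’s just clicks divided by impressions. A minimal sketch (the counts below are illustrative, not historical data):

```python
def click_through_rate(clicks: int, impressions: int) -> float:
    """Fraction of ad views that turned into clicks."""
    if impressions <= 0:
        raise ValueError("impressions must be positive")
    return clicks / impressions

# Illustrative counts only: 44 clicks per 100 views vs. 15 per 1,000.
print(f"{click_through_rate(44, 100):.1%}")    # 44.0%, 1994-style novelty
print(f"{click_through_rate(15, 1000):.1%}")   # 1.5%, a "successful" modern campaign
```

The collapse from one figure to the other is the whole story of the banner ad’s first two decades.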
Well, like any newfound power, the last 20 years have been spent squeezing as much juice out of the banner ad as humanly possible. The banner ad has infiltrated and dominated internet advertising ever since. Advertisers love anything that can be measured, quantified and packaged in the never-ending search for ROI. And why not? If I spend a million dollars on an ad campaign, it’s only fair to assume I should get at least a million and one in return. Why else spend the million in the first place?
Of course, in the real world, it’s never been that easy. When the first graffiti artist wrote “eruntque comedantes in Joe scriptor” (Eat at Joe’s) on the Colosseum wall, he didn’t know exactly how many people would see it; he only knew that no one would see it if he hadn’t written it there at all. And if that meant that just one more person showed up at Joe’s because they did read it, then the “ad” could be considered a success compared to its cost: the time it took to write it.
To some degree, up until 1994 that was the general business model of all advertising, everywhere. But now that we’re in the “digital” age, advertisers want to know exactly who did what, and where. It’s not important that consumers hate banner ads; it’s important that the 1.5% who don’t hate them can be measured, quantified and packaged. In short, the power of the number is seen as more powerful than the creative idea. I guess…after all, it was someone’s idea to write “Eat at Joe’s” in the first place, but probably not to sit and watch how many people read it and how many actually showed up at Joe’s because of it. That would come later with the invention of the coupon, but that’s another story.
This brings us back to technology. The medium of the internet is no longer passive, but “interactive”. The user can now interact with the advert directly and on the spot. Unfortunately, over the last 20 years this seems to have become more important than whether the user wants to or not. Mind you, as banner ads have evolved, they have striven to become more informative, more entertaining, more captivating, but let’s face it: in your average perusal of the internet, how many banner ads do you actually click on?
Granted, the first 10 years or so of internet advertising were more novelty than science, but as the internet grew with the wider availability of broadband, things started getting serious. More advertisers started diverting more of their budgets from traditional print and television to digital advertising, once again chasing those ever-increasing numbers. The more eyeballs the better. But this posed a dilemma in determining whether all these electronic ads were having any positive effect on all those eyeballs. What was more important: how many people saw the ad (impressions), or how many people “clicked” on it (CTR, CPM…anything with a “C” in front of it)?
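The distinction matters because those metrics count different things: impressions count views, CPM prices them by the thousand, and CTR measures the fraction that clicked. A toy illustration with invented campaign numbers:

```python
# Invented numbers for one hypothetical campaign.
impressions = 2_000_000   # how many eyeballs saw the ad
clicks = 30_000           # how many measurable responses it got
cpm_rate = 2.50           # assumed price in dollars per thousand impressions

spend = impressions / 1000 * cpm_rate   # CPM pricing pays for views, clicked or not
ctr = clicks / impressions              # CTR measures action, not reach

print(f"spend: ${spend:,.2f}")   # $5,000.00
print(f"CTR:   {ctr:.2%}")       # 1.50%
```

The same campaign can look like a triumph of reach and a failure of action at the same time, which is exactly the dilemma advertisers faced.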
The debate was just heating up when Google unleashed “targeted” advertising on the world. Now, instead of just flashing as many ads as possible in front of as many eyeballs as possible, specific ads could be shown to the specific eyeballs much more inclined to respond to them. Mind you, this sometimes meant fewer eyeballs, but if it meant more identifiable responses then obviously that’s a good thing, right? Well, whether it is or not, Google, a simple technology company whose core business had been providing a search engine that people could use to sift through the infinite volumes of information available on the internet, became an advertising powerhouse practically overnight! Billions of dollars poured into the technology cupboard and a new science was born…targeted advertising.
In reality, targeted advertising is nothing new. It’s why you see all those beer commercials during football games, Hyundai commercials during The Big Bang Theory and finance commercials during 60 Minutes. The only difference is that in television-land the targets are much, much bigger. But there’s the catch. In internet advertising, who sets the targets? How big or small should those targets be? How do you target consumers without “creeping” them out with Minority Report-style intrusion? The rise of digital-only advertising has had some measurable success, but even these campaigns still employ the black art of “creativity”, some successfully, some not.
The answer? Let’s dispense with the creativity and leave it all to science…the rise of Programmatic. What is programmatic advertising? I recently did a search on the Flash (the superhero, not the software) television series on Amazon VOD. The next time I opened up my Facebook page, there was an ad to buy the Flash series on VOD. Why? How? Because a programmatic ad server matched an advertiser’s ad, whoever is selling the Flash DVD, with an ad publisher, Facebook. Who knows how? But I wasn’t creeped out or alarmed. In fact, I was mildly amused. It could be creepy that my interests follow me around on the Net, but I did like the fact that at least it displayed something I was genuinely interested in. It was not the first programmatic ad I have seen, but it’s the first one I actually liked. It would have had a much greater impact, however, if it had appeared on the Amazon site when I really was interested in the topic. But the trend is clear. The technology is there, and even though no one quite understands it, including me, and I’m in the business, it’s here to stay.
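No real exchange works exactly this way (real systems run live auctions among many parties in milliseconds), and every ad, keyword and function name below is invented, but the core idea of matching an advertiser’s ad to a user’s recently observed interests can be caricatured in a few lines:

```python
# A toy model of programmatic matching. Everything here is hypothetical.
ads = {
    "superheroes": "Buy The Flash: The Complete Series",
    "watches": "Expensivo watches, 20% off",
    "tablets": "Premium iPad styluses",
}

def observed_interests(search_history):
    """Crudely map recent searches onto interest categories."""
    keywords = {"flash": "superheroes", "watch": "watches", "stylus": "tablets"}
    return {cat for term in search_history
                for word, cat in keywords.items() if word in term.lower()}

def select_ad(search_history):
    """Pick an ad the user is plausibly interested in; None means no match."""
    for category in observed_interests(search_history):
        return ads[category]
    return None

print(select_ad(["the Flash TV series on Amazon VOD"]))
```

The real systems replace that keyword lookup with auctions, bidding models and mountains of behavioral data, but the publisher-advertiser-interest triangle is the same.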
Posted by admin On October 30, 2014
There’s a new trend going around, still below the radar but sure to become much more visible in the coming years. Lately, the world seems to have fallen into some credulous, nefarious and almost puerile fascination with Silicon Valley success. Not that it’s unwarranted. Practically the entire American economy seems to be floating on two big banana boats named Apple and Google. Fine. No problem here. But I can remember when both of these “titans” of the digital age were either fledgling or floundering. As always, we of the media like to sensationalize success by acting like there is no such thing as failure. In truth there can never be any success without failure. But that’s not what I want to discuss right now.
I’m more interested in what will happen when the luster wears off and these two “giants” of industry become the U.S. Steel and General Electric of the 21st century. Yes, 100 years ago it was these two powerhouse corporations that were leading the world into the 20th century! They were the Wall Street darlings and the flagships of U.S. of A. world dominance. Of course, neither had reached its full potential, and both were greatly overshadowed by the downfall of Standard Oil, but these two young but ambitious companies were ready to lead the industrial charge into an ever-changing, ever more competitive world. After a catastrophic war, they emerged as the new titans in an over-exuberant, overly optimistic world. Rich and poor alike valued their future according to how much stock they owned in either company, or both.
“Bethlehem Steel stock rose from a pre-war average of $25 to $700 in 1916“. Sound familiar? As of this writing, US Steel stock prices are around $28 per share and GE around $25. Both considered very “affordable” stocks.
Well, we all know what eventually happened, and there’s no reason to assume there will be more wars and depressions in the 21st century (though, unfortunately, they can’t be ruled out either). But the point I’m trying to make is simple. What may shine brightly now will not last forever. Not that Apple and Google won’t continue to be the technical giants that they are, but the world, especially in technology, is always changing, always evolving, and these days at breakneck speed. Silicon Valley was not always Silicon Valley. Before the geniuses of innovation rose to acclaim in San Jose and San Mateo, the east dominated the digital landscape. Areas like Route 128 in Massachusetts and the Beltway in the Washington D.C. metropolitan area were the places to be for nerds and capitalists alike. Anyone remember AOL? But that was the age of proprietary hardware and shrink-wrap software. Practically prehistoric by today’s standards. But that’s the point. Silicon Valley didn’t become Silicon Valley because California has a propensity to grow technical genius the way it produces oranges. No. It was an early proponent of open systems and open source, and embraced technologies like Unix while the rest of the world clung to proprietary, licensed and very expensive operating systems. In short, California techies were simply in the right place at the right time. Granted, it was the Jobs and Gates of the world who knew how to take advantage of it, but the world is a mighty big place.
We are now living in a digital world where communication is the new Bessemer process and the Internet the new electric grid. The Carnegies and J.P. Morgans were the “visionaries” of their day too. But because of where we are technically, financially and, more importantly, culturally, new visionaries will arise once again on the east coast of the U.S., and in one place in particular. The next rival to Apple will be nothing less than the Big Apple. The time has come. Every millennial in the greater New York area has been weaned on the concept, the notion, the inevitability of entrepreneurship. No one believes in getting rich by working for a corporation; in fact, to them that’s giving up all hope of any future at all. We’ll talk about that in a later article. But you also don’t have to run off to Silicon Valley to make your fortune. Silicon Alley is growing, and growing fast. And this time we’re not looking at an east coast edition of the libertarian society of elitist white males congratulating themselves on their own cleverness (well, at least not in the long run) but a polyglot of cultures, races and creeds, male and female, doing what New Yorkers have been doing since they opened the Erie Canal. Everyone from hard-core street kids to pampered socialites will be pitching, hustling and, most importantly, BUILDING their technical dreams.
True, the first success stories will come from the Flatiron District and Brooklyn’s Dumbo, but by the 2020s they will be in East Harlem, Chinatown and Hell’s Kitchen. They will be on the Grand Concourse and Roosevelt Ave. The Big Apple hasn’t exploited the enormous technical and entrepreneurial talent that has always been here because cultures, like bad habits, die hard. For years New York’s media, finance and banking, and, above all, advertising industries have confused high tech with “IT”. They’re the wonky guys with pencil holders and horn-rimmed glasses people talk to when they can’t get their email or a printout. But now those “IT” guys are developing minds of their own and a whole new culture to go with it. The allure and mystique of entrepreneurship has been seeded by the Larry Pages, Mark Zuckerbergs and, of course, the god of all techie gods, the late Steve Jobs. Nerds are no longer just nerds, but “visionaries” eagerly sought after by the capital markets of the world. And where are the best capital markets, you ask? Why, of course, the financial center of America…good old NYC!
The “glamour” VCs may still be clustered in the “Valley”, growing ever wealthier on the Elon Musks and Peter Thiels of the world, but the older, savvier and more aggressive old-school VCs of lower Manhattan are slowly awakening to the infinite potential of “hi-tech”. Venmo, MongoDB and Etsy don’t flow as freely off the tongue as Google, Facebook and Apple…but eventually the sheer number of startups will create its own center of gravity inside the Empire State. State-based incentives to encourage innovation, growth and investment on a scale not seen since the Great Depression will assure a long-lasting and earnest desire for entrepreneurs to start, build and sell companies here like nobody’s business. But unlike the cool, aloof libertarians with their narrow viewpoints and open collars, the New York entrepreneur will be tough, gritty and determined. They will be anxious and open-minded, tenacious and resourceful…in short, true New Yoikers! It may take a decade or two, but Manhattan is destined to be the next Silicon Island, perhaps, once again, the financial and technical pearl of the world.
Posted by admin On October 13, 2014
We live in a society already obsessed with age. It’s no secret that advertising begins and ends with the “demographic”, and the prime demographic, no matter what the era, is 18-35 years of age. This, quite simply, is the age of “buying things”.
Young adulthood is the time of long-sought-after independence: first jobs, first apartments, early marriage, first children. The idea is simple: if you’re ready to start building a life…you’re ready to start buying. Economically speaking, this makes sense. However, I can’t help but think that the blind devotion to this age group goes beyond the simple fact that the average 25-year-old, with their first or second real job, is more likely to buy a pair of jeans than a seersucker suit. But what if I were a manufacturer of seersucker suits and wanted to sell ever more seersucker suits?
My research shows me that the majority of people who buy my suits are between the ages of 40 and 65. This age group is the core of my business: in today’s jargon, the “Baby Boomers”. But the Boomers are aging, and people that age don’t consume, they save.
They save for oncoming medical expenses, retirement, and, yes, retirement homes and funeral services. Not very sexy…not very exciting; in fact, downright depressing. And yet, despite the fact that there will always be human beings getting older every day (how many “unsuccessful” funeral homes do you know of?), I still want, no, NEED, a larger portion of sales to go to 18-to-35-year-olds. This validates not only my product and my business, but ME as a purveyor of seersucker. I too am trendy. I too am cool!
After all, a “demographic” is just a condensation of what was once called a “group dynamic”. Just by being a “Millennial” I feel special, not only because I am young and exuberant but because the media magic that affects me, no matter what, insists on my being special. It saves me a lot of trouble: I don’t have to create my own sense of self, or live up to what my parents expect of me, or even what my peers and friends expect of me. I neatly fit into a world of Bud Light beer, Levi’s jeans, Nike sneakers and my trusty iPhone in my hip pocket. I am now assured of success on my job, success in love, success in life! It’s profoundly easy to live one’s dream life if that dream life is laid out before you. All I have to do is buy it. But where does the dream end and the reality begin, and vice versa? Is there even any real difference between the two?
I hire an ad agency; they do their own research on how “Millennials” feel about seersucker suits, how their peer groups feel about seersucker, and the best ways to “reach” Millennials to share with them the merits, the coolness, of seersucker. I could “reinvent” the power of seersucker through social strategies: Facebook pages devoted to seersucker, daily tweets on the latest celebrities and sports stars decked out in seersucker. Perhaps build a “Seersucker and You” app for iOS and Android that displays numerous options and apparel for men and women alike. Seersucker can be the “Next Big Thing”!
And our “Boomer” generation? What about them? Do they continue to buy their seersucker, or now that seersucker is for the “young”, do they abandon it altogether? Do I, the seersucker manufacturer, abandon the aging, unexciting, uncool Baby Boomer? After all, we all know that life after 40 just becomes a vast market for American luxury cars, golf vacations and Viagra. There is no veneration of experience and wisdom and other “earthly” values because, quite frankly, there’s no money in it. This is, of course, an exaggeration, but is it really THAT exaggerated?
We are now a culture of brands. According to Wikipedia, a brand is the “name, term, design, symbol, or any other feature that identifies one seller’s product distinct from those of other sellers.” Brands are used in business, marketing, and advertising.
We have become so immersed in the culture of brands and branding that we now routinely brand ourselves, and I don’t mean personal branding, but group branding. We are all Millennials, Boomers, Generation X, Y and Zers, and like any other brand, we carry expectations of service, quality and dependability…or not. I hate to say it, but somewhere in all this, where is good old-fashioned humanity? I don’t mean the “crunchy granola” sense of humanity but the simple human respect for those who came before us and those who come after. We now live in a world where each new generation takes credit for “inventing the wheel”, not because of peer pressure, but because that’s what they are told. Is our sense of self really our self, or what we’re told we’re supposed to be, whatever “category” we find ourselves in? Am I better off because I’m a Millennial and not a Boomer? Is my life actually better because I’m the target market? Is my life really better because of my Levi’s and iPhone? And why is it so easy to believe it is? In short, it probably is if I actually think it is. And after all, isn’t that the REAL power of advertising…not changing our minds, but making up our minds?
But the real point, the one no one really wants to think about, is that in 20 years it will be the “Millennials” who become the new “Boomers”, and whatever generation comes after them, whatever their “brand”, will become the new darlings, the new “It” crowd, the new Chosen, and the cycle will begin again.
More research, more strategies, ever more complex, ever more compelling ways to sell seersucker to them. And in turn, they will believe, some more than others, that they were the first to discover the “coolness” of seersucker. Remember, seersucker itself has been around since the beginnings of the colonial British Empire, a time when the average Millennial’s grandfather’s grandfather was buying the new, the trendy, seersucker suit. I bet grandpa’s grandpa thought seersucker was pretty “cool”, too. (Yes, pun intended…)
Posted by admin On July 14, 2014
Time to get back into the swing of things. Over the last few months I’ve been “experiencing” the internet rather than writing about it, i.e., observing it. Being a developer, I don’t just get to “see” what’s happening; I can make it happen too. However, it is pretty easy to get lost in the trees when exploring the forest, so every so often I have to step back a little and take a look at the big picture. As notably predicted for the last 25 years, not only in my own observations but in countless others, the World Wide Web, better known as the “Internet”, is rapidly becoming the backbone of today’s media…and no, not just social media…all media. But that is not the realization I’m going to write about. As far as it goes, that is not even a realization so much as a statement of the obvious. The realization comes from the fact that, for the first time, the “media” is separating from the “medium”, but the messages must still remain intact.
“The medium is the message” is a phrase coined by Marshall McLuhan meaning that the form of a medium embeds itself in the message, creating a symbiotic relationship by which the medium influences how the message is perceived.
Two hundred years ago, carrier pigeons were the best way to get the news…depending on where you were, of course, but the message was confined to the characteristics of the pigeons. If a particular pigeon was not much of a homebody, or fell in love with a lovely little pheasant along the way, it was entirely possible the message could be lost. Technology has been seeking the perfect “lossless” protocol ever since. But to make a long story short: to transmit information via pigeon, you needed a pigeon. To send telegrams you needed a telegraph. To read a paper you needed a printing press. To listen to radio broadcasts you needed a radio. To watch a movie you needed a theatre, and last but not least, to watch television…well, you know where I’m headed.
As the Internet becomes the “Net”, the universal medium that now carries messages, images, text, news, movies, television, you name it, the “media” now has to conform to a multitude of platforms of all shapes and sizes…not just one. And as the medium becomes more universal, with everything from your mobile phone to your refrigerator eventually attached to the “Net”, a good deal of thought now has to be put into the form as well as the substance. Using a telegraph meant clicking the clicky thing, short for a dot, long for a dash…and that’s it. Hopefully, the tapper and tappee at either end had the skills to both send and decode the message. The medium only had to convey that message using its own unique protocol.
Now the protocol is becoming universal…HTTP…but the devices are becoming more complex. In other words, it’s the “clicky” things that are changing, becoming more diverse. Do you want to receive your messages on your laptop, your mobile phone, your television, your refrigerator? All of these are possible now, and each platform has to be taken into account on its own merits. Each platform, though connected to the web, still has its own unique characteristics, functions and abilities that can be taken advantage of when consuming that message. Your laptop can organize your messages, your mobile can notify you when they arrive, your refrigerator can order a quart of milk from shop.com when it “notices” you need it. The possibilities are endless. So are the choices modern web designers and developers have to make. Hence the continuing rise of the new generation of “User Experience Designers”, the catchphrase for technology professionals in the early 21st century. It is the User Experience Designer who has to figure out, given a single message, how that message will be perceived, viewed, collected and stored, not by the medium but by the device attached to the medium: your browser, your tablet, your mobile phone, your toaster…whatever.
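That one-message, many-devices decision can be caricatured in a few lines. The device categories and renderings below are all invented for illustration; the point is only that the message stays fixed while the presentation dispatches on the device:

```python
# A toy dispatcher: one message, device-specific presentation.
# The device categories and renderings are invented for illustration.
def render(message: str, device: str) -> str:
    if device == "laptop":
        return f"<article><p>{message}</p></article>"  # full page, organized and archived
    if device == "phone":
        return f"[push] {message[:40]}"                # short, attention-grabbing notification
    if device == "fridge":
        # Only act on messages the appliance can do something about.
        return f"ORDER_HINT: {message}" if "milk" in message else ""
    return message                                     # unknown device: plain text fallback

msg = "You are low on milk"
for device in ("laptop", "phone", "fridge"):
    print(device, "->", render(msg, device))
```

Real multi-device delivery involves content negotiation, responsive layout and platform SDKs rather than a single `if` ladder, but the design question is the same one the User Experience Designer faces.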
The foot soldier in all this will be the hardworking, ever-present, ubiquitous “App”, made popular by the Apple ecosystem and now sweeping through the universe at light speed. There are apps for everything: keeping in touch, hiding in secret, collecting bitcoins and selling old socks. You name it, there has to be an app for it. In digital advertising, clever slogans and fancy jingles have given way to the “app”. You can’t just sell dog food; you have to have an app that will keep track of Fido’s calorie intake and weight. The “app” has to be in as much demand as the product you’re selling! That can be a pretty tall order. And from an insider’s point of view, before one can even begin to explore actually creating an app, the first and most important question to ask is “Web or Native?”
This has been, and will continue to be, an ongoing debate for years to come. As the Internet continues to change human culture, all these new devices and capabilities are creating cultures of their own. It’s not enough to just get a text message. If I have an iPhone I want an iPhone text message. If I’m team Android I want an Android text message. If I’m on my laptop, do I want a Safari or Chrome or (God forbid) an Internet Explorer message? It may seem trivial, but it’s not. Developing apps used to be a simple matter of knowing a technology that was both reasonably popular and that you had, or could acquire, some familiarity with. Not any more. We no longer deal in the technology realm alone but in the “culture” of technology. Apps can be, but are no longer expected to be, monolithic. Facebook is Facebook on your laptop, but is it still Facebook on your iPhone? On your Samsung? On your toaster?
That’s right: what if I want my toaster to post to my Facebook timeline…“the toast is done”? That may seem far-fetched, but wouldn’t that be a good idea if I were Braun, or, more importantly, SELLING Braun toasters? Here the question of “Web or Native” goes well beyond a matter of technology and returns, in no small way, to “the medium is the message”! Given my 1,000-word limit, I won’t go into the debate itself in this article, but it will be the source of articles to come…stay tuned! (On the device of your choice, of course!)