Posted by admin On January 13, 2018
Jamie Dimon, CEO of JPMorgan Chase, admitted “with regret” his notorious remark from last year, when the Wall Street banker called Bitcoin a “fraud”. Just to follow up from my last article: the last bastion of stubborn denial has finally bitten the bullet and accepted that the universe has now changed forever. After all, if I’m rich (or poor) and old-fashioned paper money goes away, what happens then? Who has what? We have never lived in a world without rich and poor…is such a thing possible?
Who knows. But the floodgates have been opened and the world is now off on its third big revolution since the time of our great-great-grandfathers: the Industrial Age, the Digital Age and now the “Coin Age”. And believe it or not, we are already firmly in it, regardless of what the media says or doesn’t say. Beneath all the hysteria and hype there are some very simple things going on around the world. And it really is time to pay attention to them.
Jamie Dimon reportedly now regrets calling Bitcoin a “fraud”, although he is still not in favour of a “hidden currency”. The irony in this statement is that there is nothing hidden about it, not Bitcoin or Litecoin or any other blockchain-based cryptocurrency; in fact, that’s the whole point. EVERYONE knows what’s going on ALL the time. That’s the real boogie man in the closet. But the press has had a field day with the diabolical “Silk Road” stories. Hopefully, now they can find the revaluation of capital based on highly efficient and organized “trust” an even more provocative topic. I hope so. Because as the world evolves around us, we’re more interested in what outrage Donald Trump has created today.
Even more ironic is how quickly the “Old Guard” has applied 20th-century tools to a 21st-century currency. I don’t see why not. Bitcoin’s price was supported by the launch of bitcoin futures contracts by both the CME and CBOE last month, following US regulatory approval, a move supporters considered helpful in legitimizing the virtual currency.
When Dimon said, “It’s not a real thing; eventually it will be closed”, he seemed to misunderstand the deepest aspect of Bitcoin: nobody can close it. Dimon added: “The question for me has always been what governments will feel about Bitcoin when it gets really big, and I just have a different view from other people.” Whatever that means. In any case, you get the point. The conflict of interest is obvious here, because Bitcoin threatens the profits, and potentially the existence, of large banks such as the one he is managing. By ignoring the long-term value proposition of cryptocurrency, Dimon made the mistake of judging Bitcoin by its rapidly rising price.
On the other side of the coin, the so-called “early adopters” made a killing. And you can expect more of the same as the “market” in cryptocurrencies continues to grow and mature. But the downside is that now everybody and their mother wants “in” on the game. For a while there will be a lot more “crazy” than real value, but now that everybody on earth has the means to capitalize their own value and worth without a centralized bank, government or the like, true global democratization may finally be feasible. Who knows…
Posted by the kid On December 26, 2017
Just read an article on how big British banks no longer consider the blockchain the “boogie-man”; now it’s the savior of the industry. With that comes the death knell not only for the old way of doing business but for the 20th century as a whole, wrapped up neatly the way the 19th century seemed a million miles away in 1918. But that change took a cataclysmic war to finally drive the stake through the heart of monarchies, institutionalized colonization and extreme wealth disparity. The 20th century wrought the Roosevelt Republic, worker unions, cheap electricity and the “Age of Oil”. Well, those are pretty much gone too, and with the exception of income disparity, the slate is being wiped clean. A new beginning awaits. And like anything unknown, it breeds both apprehension and hope at the very same time.
In this case, because of our culture’s irrational insistence that technology is akin to magic or the will of the gods, “Bitcoin”, “blockchain” and “digital currency” stir more emotions than the Israeli–Palestinian conflict. My guess is that was pretty much the same reaction when man discovered how to harness fire. Yes, this simple blockchain technology has the power to propel humankind to the “next level”, and it’s not nearly as complicated as everybody believes it to be. But I’ll get into that later.
The article struck me in particular because of its hubris. Rather than admit the current worldwide financial structure has failed, the banks (or the article, at least) take the stance that the chain is being adopted only to “reduce costs” and are trying their best to find a way to fit it into the “old” system. Whenever a disruptive technology comes along, whoever is safeguarding the current status quo goes through the same process as someone with a terminal illness: first denial, then anger, then a dull, aching acceptance. That’s when some people resign gracefully, some go on a bucket list, some do whatever they can to finish the string on their own terms. But the difference here is that this time it’s not terminal. Finance, money, wealth and the institutions that created and supported them for the last 500 years are not going to die. But they will be reborn. The blockchain is not a death knell but an alarm clock. It’s time to wake up. Fully adopted, it will allow banks to process payments faster and more accurately, while reducing transaction execution costs and the need for exception handling.
Fans of blockchain technology believe that it can be used to create a safe and convenient alternative to time-consuming and costly banking processes. Theoretically, agile startups could build software on blockchain protocols, hoping to provide a safer, quicker, cheaper and more transparent alternative to traditional financial intermediaries such as banks, brokers and complex billing processes. There are, of course, many potential applications of blockchain technology, including fighting identity fraud and money laundering, improving know-your-customer and collaborative systems, and accelerating cross-border payments, letter-of-credit procedures and lending.
In short, the blockchain is our friend, not our foe. However, it does change the rules. The way we view money now is based on the capitalist theory of financial Darwinism, survival of the “fittest”, i.e., to the victor go the spoils. In other words, if someone makes money, someone must lose money in a sort of zero-sum game. That’s what creates the competitiveness that drives innovation. It’s also what drives the greed, inhumanity and mean-spiritedness often associated with the current “financial markets”. Alas, these are not the faults in the system but faults that are in ourselves. It’s people that are greedy, inhumane and mean-spirited…not money. The blockchain quietly removes the “human” factor and lets the power of mathematics take the reins.
Central banks around the world are exploring the possibility of moving some of their payment systems to blockchain technology, or even using it to launch digital currencies. Money, after all, is really just a man-made “trust” system. Trust is intangible but essential for any human being to live as a human being. It is up to each of us to make trust, our own and others’, “tangible”.
Are we as a species ready to put our trust in a decentralized system based entirely on mathematics? It’s certainly easier than trusting other human beings. In finance, as new ideas emerge, expect banks and related intermediaries to agree on common standards, with regulatory support, for sharing the costs of building the blockchain, whether it uses existing infrastructure or not. Post-trade settlement for a wide range of securities, including syndicated bank loans, is one of the most commonly discussed potential use cases for blockchain technology.
The back office is another place banks can use the blockchain to increase the speed and efficiency of settlement systems, while a clearing coin would allow banks to transfer value and assets without the long waits of today’s traditional methods. The biggest key to translating the blockchain’s potential into reality is cooperation among banks to create the network necessary to support global payments.
However, as with trade finance, blockchain technology by itself will not solve every problem stemming from the failures of the current financial system.
Posted by admin On December 15, 2017
“Artificial Intelligence” has become quite the buzzword these days, especially in an era of never-ending buzzwords. Every day there’s excitement, enthusiasm and mesmerizing anticipation, along with all the drama of the ages as mankind seemingly climbs the evolutionary ladder a dozen rungs at a time. Like everybody else, I have my own opinions, but what can differentiate mine from anybody else’s? To go along with all the hype comes a tremendous amount of absolutely useless noise. How can my own little voice rise above the noise and not become part of it?
Although I believe that artificial intelligence is overestimated in many respects, I am nevertheless very excited about its potential, even after accounting for all the noise. A large part of the hype concentrates on the “singularity”, the point at which artificial intelligence will outstrip human intelligence. Previously I had to go easy on mentioning “artificial intelligence” so it didn’t sound too “sci-fi” and scare people off; now observers are disappointed if your solution is not completely magical. Recent advances have shown us not what we cannot yet achieve (conscious programmatic intelligence), but what we can create instead, something less dramatic but at the same time very valuable: unconscious programmatic intelligence.
But, unfortunately, I think people spend WAY too much time thinking about enhancing a machine’s intelligence to the point where it “turns” on mankind. We should be thinking about the possibilities of enhancing our OWN intelligence instead. After all, AI is just a tool like any other. Apparently we fear our machines because we fear ourselves. It’s the “augmented” self we need to become aware of. We are “augmented” whenever we pull out Google Maps on our cell phone to find out where we are…or send a live video to our friends on WhatsApp and Instagram. We don’t have to try to “out-think” a computer. We just need it to help US think. THAT’S the point! We just need to get better at that and then watch what happens.
Any account on Facebook or online is already a link between the human and some artificial intelligence.
Just as the brain recognizes patterns and helps us categorize and classify information, neural networks do the same for computers. Nothing makes you appreciate human intelligence like finding out how incredibly difficult it is to create a computer as wise as we are. Any attempt to interpret human behaviour primarily as a system of computational mechanisms, and the brain as a kind of computing apparatus, is doomed to failure. Yet if an intelligent machine were able to discern intricate, hidden regularities in the data about what we have done in the past, it could extrapolate our later desires, even ones we do not fully know ourselves.
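As a toy illustration of that pattern-finding, and it is my own sketch, not the workings of any particular AI product, here is a single artificial neuron, a perceptron, learning the logical AND pattern from examples instead of being handed the rule:

```javascript
// A single-neuron "pattern recognizer" (a perceptron). It is shown
// labeled examples and nudges its weights whenever it guesses wrong,
// until its guesses match the pattern in the data.
function trainPerceptron(samples, epochs = 20, lr = 0.1) {
  let w = [0, 0], b = 0;
  for (let e = 0; e < epochs; e++) {
    for (const { x, y } of samples) {
      const out = (w[0] * x[0] + w[1] * x[1] + b) > 0 ? 1 : 0;
      const err = y - out;           // 0 when the guess was right
      w[0] += lr * err * x[0];       // nudge weights toward the answer
      w[1] += lr * err * x[1];
      b += lr * err;
    }
  }
  // return the trained classifier
  return (x) => ((w[0] * x[0] + w[1] * x[1] + b) > 0 ? 1 : 0);
}

// Learn logical AND purely from examples:
const and = trainPerceptron([
  { x: [0, 0], y: 0 }, { x: [0, 1], y: 0 },
  { x: [1, 0], y: 0 }, { x: [1, 1], y: 1 },
]);
console.log(and([1, 1]), and([1, 0])); // 1 0
```

It never “understands” AND; it just adjusts numbers until the guesses stop being wrong, which is exactly the unconscious, pattern-matching kind of intelligence we actually know how to build.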
Then I read about the huge commitment of the worldwide software industry to artificial intelligence and neuroscience. The collective intelligence of humanity, starting with the development of language and of large, densely populated communities, advanced by the inventions of writing and printing, and now enhanced by tools such as the Internet, is one of the main reasons we have managed to overtake all other species. Until the 1990s, despite computers’ punishing chess moves, we were still not very close to artificial intelligence in general. So while man can always act in new ways (no matter how impressive our brains are, perhaps thanks to a cocktail of very human traits like intuition and spontaneity), computers become stuck when they come across situations where they are not told how to act (or how to learn to act).
But the real “proof of the pudding” is this article itself. It is entirely my own, original article. However, part of it was written by me by hand, and part of it was written by “Stinky”, my AI ghost writer. Can you tell which is which? It won’t be hard to tell the difference between “fact” and “feeling”. After all, that’s what good writing is all about. Just remember, if you judge this to be a good article, it’s not because I used AI to write it, it’s because AI made ME a better writer. I don’t mind a little “extra” intelligence. Do you?
Posted by admin On July 8, 2016
This is getting deep. As I become more and more immersed in the world of digital advertising, I’m realizing how complex this all really is…and, of course, still so simple. Advertising has always been about getting and holding people’s attention…an hour, a minute, a second…attention + time = value. And that “value” is how advertising agencies get paid. But in the amorphous world of attention, we can’t always assume we have it, unless, of course, we do. By that I mean the 20th-century model of attention: holding someone’s attention for a couple of hours, or an hour, or a moment for that matter. Once the advertiser has your attention, they can provide the occasional “public message” or commercial, assuming that you are still glued to the set, that if spellbound by CSI (Miami, New York, LA…etc.) you would be just as “glued” to the soap sell. Of course, we all know that’s not usually the case; this is where most of us flip the channel, make a sandwich, mentally tune out. But there are always exceptions. If you can just catch the right someone at the right time with the right product then SHAZAM! you have a sale. Odds are against it, but if you have millions of eyeballs sucking in your message, you’re bound to get some bites. And even without the direct response, being the greedy little consumers that we are, we inevitably shop for things we know nothing about…from soap to snowblowers, and as we’re walking through the shopping aisle hoping to find brand Whatever, we find brands A to Z. So many to choose from, how do I choose? Why, I see Brand A on TV all the time, I’ll buy that one…even though Brand B is cheaper and probably superior. But that’s what those advertiser dollars are for.
Unfortunately, those dollars can’t be exactly correlated into actual dollars and cents. Advertising, like economics, is an inexact science. Well, it WAS an inexact science anyway. Now it’s becoming almost pure science. I use the television analogy for two reasons.
1) I am still, yes I hate to admit, a regular television watcher. No, I don’t watch network shows or America’s Got Talent. Mostly sports, old movies and nostalgia, and even then, though I shouldn’t admit it, I have to trust my DVR to save the day for me. In one nostalgic episode of one of my favorite comedian-narrated shows, I counted 10 commercials between the intro and the start of the actual program. TEN! There was a time when it took the entire half hour to show 10 commercials. No one over the age of 8 is going to sit through 10 commercials without flipping the channel, going to sleep or turning the tube off altogether. No one.
2) People like me, people who watch any kind of television, are rapidly going the way of the dodo bird. Not enough to kill the industry just yet, but enough to see the writing on the wall. And that writing will be page views, event conversions and line graphs. The Internet has brought an evolutionary leap into our lives. I can honestly remember a time when the “Internet”, better known as the “World Wide Web”, was there simply to “inform”. Now we depend on it to run our daily lives, our governments and, yes, our entertainment…in short, it’s where all those eyeballs are drifting off to…by the hundreds of millions…even billions.
But it’s the sheer scope of the Internet that provides both the risks and rewards. With hundreds of millions of eyeballs pouring through the Internet every day, sitting down behind a tube to watch today’s bad news seems superfluous to the average user of the Net, whether on desktop, pad or mobile. But the real magic of the Internet isn’t just all those eyeballs; it’s that now we know what all those eyeballs see when they come to any particular website, whether it’s to read international news, catch up with friends and family on Facebook or buy a new pair of shoes. It may seem a little creepy, but actually it’s not; it just gives the new generation of “techies” that much more to play with…data. Data, data everywhere. We’ll get to that subject in another article to come.
Posted by admin On May 5, 2016
Just read a very interesting article in Advertising Age on the “Agency of the Future”. I couldn’t help but find it fascinating, since I had given a Ted talk at JWT last year with the exact same title and theme. However, my own speech was a lot more mundane compared to the high-falutin’ spiel an Advertising Age writer can spin. After all, it is the advertising business, so one comes to expect a reasonable amount of razz-matazz, but this article really laid it on thick. The basic gist of my own premise was that, like in any other business, technology is here to stay, and not only to stay, but eventually to rule. Usually that means fear and panic for all those “non-techies” out there. This article is no different.
Following are some direct quotes from the article:
“We’re looking for a higher degree of consolidation to make integration and interdependence more effective”
“There will be dedicated client teams and a greater degree of open-sourcing of talent and capability”
“The most effective creative will come from the integration of content creation and distribution, and greater in-house content publishing resources.”
Blah, blah, blah…
When advertising executives start using this many buzzwords and neo-clichés it can mean only one thing…run for the hills before the smoke clears, boys…the jig is up! In other words, to quote, they are “…predicting a 25% reduction in head count for holding companies in the next 5 to 10 years as a result of the ‘power of automation’ in content creation and distribution and the impact of artificial intelligence on administrative roles”. In plain English: tech is taking over, kids, get with it or get out. It’s ironic that advertising execs are starting to sound like rust-belt execs 20 years ago. That’s no coincidence. We are coming to the end of one era and the beginning of another. The smug, aloof Madmen of the past are slowly, but ever so surely, being replaced by smug, aloof “techies” secure in their insular knowledge of systems, servers and code. I’ve written about this before and I have the feeling I’ll be writing about it for some time to come. Almost 7 years ago I wrote about the natural phenomenon of “creative destruction”. In short, as a new industry arises (usually due to new technology), old ones fall by the wayside.
In that article, I tried to point out that in the 20th century the dissolution of an entire industry into others was a relatively slow, albeit inevitable, process that generally took a generation or two to complete. But in the 21st century those processes have been sped up considerably by the “digital age” and the expectations that come along with it. What all the above Ad Age double talk really comes down to is that the “old” way of life in advertising, big agencies, big budgets, big dinner checks, is coming to a close, and a whole generation, maybe two, had better either start taking as many General Assembly tech courses as they can or start working at Starbucks.
I’ve been a “techie” my whole career, in a variety of businesses, not just advertising, so I know first hand what a closed-in, self-important little group of anti-social “nerds” can really be. They can’t help themselves; in a knowledge economy it’s natural for an “I know more than you” psychology to thrive among the techies themselves. You can imagine what manifests itself when “hard-core” techies encounter “non-techies”, bean-counters and anyone over 30. This will eventually include C-level execs, boards of directors and the rest of the human race as a whole. Trust me, I’m not exaggerating. Soon it will be 20-something content producers producing content for 20-something content consumers. Not bad if you’re a 20-something developer who can explain how to code a binary tree or a hash table. But not so much for everyone else.
As in any culture, too much “in-breeding” is not a good thing. Eventually, when we have a culture dominated by 30-year-old men in T-shirts and sneakers buying and selling to each other, when every form of media takes the form of a video game or something equally juvenile, and programmatic becomes automatic, hopefully there will still be some spark of actual creativity, maturity and human emotion. Yes, that’s always been a lot to expect from advertising in general, but language has a way of turning into actions and cultures have a way of turning into reality. Cultures are only a group point of view, but from the inside out, they are as tangible as the warmth from the sun. However, cultures also have a tendency to be somewhat myopic and jingoistic. Western culture invented phrases like “Third World” and “Red Menace” which, when closely examined, are just perspectives that suit those who are comfortable with such divisions. Unfortunately, most “techie” cultures aren’t warm and friendly. They are usually competitive, cliquish and harsh, Google notwithstanding. But more and more, it is that culture which is beginning to prevail in corporate culture, social culture and, yes, Western culture as a whole. Finally, the most ominous quote from the article for me:
“People who understand data and omnichannel ultimately become the most responsible custodians of a company’s money and how to spend it.”
There was a time when understanding “people” was the key to commercial and, sometimes, even personal success. Yes, it truly is the end of an era.
Posted by admin On March 6, 2016
If I were a sci-fi writer or an Orwellian social critic I would call the 21st century the “Rise of the Techie”, and, indeed, it is. But I don’t think the societal structures currently in place have any idea what that will mean to the “normal” day-to-day life of Jane and John Doe. From my point of view in advertising, and now ad-tech, these trends are much easier to see coming than for an average “techie”. “Techies”, whatever that term ACTUALLY means, are seen as the most unaware people in the world, focusing on “apps”, “product cycles” and “agile configurations”, nomenclature common in today’s trendy world of millennials starting up a brand new “hi-tech” start-up every day. The cultural language of our time is full of terms like “social network”, “gaming” and “Netflix binging”. We hear it every day, but somehow it’s still something ethereal, mystical and beyond the realm of “normal” people.
“It’s a digital world and I am just a digital girl…!” -revised Cyndi Lauper
As an added consequence, anyone over 35 is immediately ejected from the “in-crowd”, since unless they’re the ones financing those brand new “hi-tech” startups…they just don’t “get it”. Which is ridiculous, but accepted in the “Ender’s Game” culture we live in. But there are exceptions. In the recent movie “The Intern”, Robert De Niro does his usual credible job of portraying someone we all want to be (and, of course, aren’t), guiding the trendy, beautiful “founder” of an ecomm company through the perils of both corporate and “real” life. My personal feelings aside, the reason I mention this here is that, although she was the CEO of a “digital” company, she could do everything from customer service to showing warehouse workers how to fold clothes for shipping…but there was one thing she DIDN’T do…code! That was left to the nerdy oddballs straight out of the Big Bang Theory: grown-up men who dress and act like boys, finally learning from a paternal De Niro the dignity and manly grace behind wearing a tie or carrying a handkerchief. Well, whatever…I may never understand why all these stereotypes are so wildly popular, but that aside, there is some reality behind all dreams.
In my previous post, The Great “Talent” Show, I expressed my dismay at how techies, developers in particular, are somehow always the forgotten man in technology’s Broadway show. Sort of like putting a lot of hype into Wolfgang Puck’s brand new 4-star restaurant without hiring anyone to cook the food. American culture seems to be one of downplaying what’s truly important for what’s shiny, bright and new. A Peter Pan culture that likes when nice things happen but doesn’t always want to be responsible for making them happen. It’s like having a century of fossil-fuel-based economic growth without anyone ever once asking, “What happens when the fuel runs out?”. We’re currently in the business of creating a global technology economy, an economy based on code, trillions of lines of code. But who’s going to write all this code?
What fossil fuels were in the 20th century, code will be in the 21st century…the fuel that runs the economies of every country in the world, 1st, 2nd or 3rd world; it will not make any difference. The demand increases every day; will the supply keep up? In the end, all those bits and bytes are put there by people. Perhaps some day there will be programs to write programs. Who knows? Time will tell. But the good news is, as if by divine decree, more and more tools to create all this digital nirvana are being created every day. And with the power of the internet, anybody who has access to it can use them.
In just the last 5 years alone, I personally have seen not just technology leaps like faster chips or cooler iPhones but “software tools” that make all that hardware more useful, more relevant and more valuable. Yes, all the Dockers, Node.js’s, GitHubs and the like are creating a modern renaissance in software development…a “Golden Age” we are just now embarking on. In the 20th century, it was C, C#, Java, Perl and PHP that were cryptic, mysterious and hidden, but provided the fuel that created the economic miracles of their day: Apple, Microsoft and the ever-amazing Sony PlayStation. The difference? Not just semantics or even code. If you look deeply under the hood of the “new” stuff, we find ourselves still surrounded by C code and “Bash” shells, just like 1990! But in 1990, to even know what this stuff meant, much less be able to use it, took real commitment. You could tinker in your garage with parts from Radio Shack, but unless you were Jobs and Wozniak, that was “unprofessional”. Generally you had to have a clear and distinct interest, and major in “Comp Sci” in college, just to get behind the screen of a “mainframe” or “mini-computer”. Then you had to get that first job in “IT” as an analyst, technical support engineer or even, yes even then, a programmer. The typical interview question then: “You know C? When can you start?”
Now any 12-year-old can start his school science project on GitHub, pull in a Node module and create a mobile app to keep the scoring average of his favorite NBA players on a day-to-day basis. I don’t know whether anybody out there is doing or will be doing such a thing, but I do know that they CAN do it! Anywhere in the world, at any time! And that is where all of our futures lie. In the hands of not just “developers”, “techies” and “propeller heads”, but little Johnny and Jane, who program as naturally as they drink milk and outgrow their clothes…openly, without stigma, ridicule and prejudice. In a world full of consumers, SOMEBODY has to be the producers. And I’m betting it won’t just be the Stanford scholars and dropouts in Silicon Valley. I’m betting it will be the kids in Sao Paulo, Nairobi and Manila.
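And the kind of project I mean really is that small. Here is a sketch in plain Node of the scoring-average idea; the player names and point totals are made-up sample data, not pulled from any real NBA feed:

```javascript
// Keep a running scoring average per player from a list of games.
// Sample data only; a real project would pull stats from some feed.
function scoringAverages(games) {
  const totals = {};
  for (const { player, points } of games) {
    const t = totals[player] || { points: 0, games: 0 };
    t.points += points;
    t.games += 1;
    totals[player] = t;
  }
  // round each average to one decimal place
  return Object.fromEntries(
    Object.entries(totals).map(([player, t]) =>
      [player, +(t.points / t.games).toFixed(1)])
  );
}

const games = [
  { player: 'Player A', points: 30 },
  { player: 'Player A', points: 24 },
  { player: 'Player B', points: 18 },
];
console.log(scoringAverages(games)); // { 'Player A': 27, 'Player B': 18 }
```

A kid who writes twenty lines like these has already joined the producer side of the economy; everything after that is polish.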
Think about it…
Posted by admin On October 12, 2015
As I watch the digital landscape change around me, not just for advertising and/or “ad tech” but for the entire world of business itself, I still find it amazing how technology itself has become gentrified. As the new generation of devs replaces the old “IT” guys, it’s almost scary how quickly knowledge is discarded for youth, for something new, and for the myth that the ability to code is linked to some mystical genius gland that some have but most don’t. Coding is both a skill and a talent, and it develops as uniquely as the individual who bears it. Just 20 years ago corporations were the mainstay of international business and, as such, called for a certain conformity in order to belong to the “corporate” culture. Today that is all but dead, except at some of the oldest, largest corporations.
All businesses are now being “de-centralized” by technology; as I said, not just advertising, ALL businesses. Which means the 20th-century “corporation” as we know it will most likely eventually disappear, re-invent itself, or become unrecognizable. In my mind, this is not necessarily a “bad” thing, but change on that scale usually means a lot of economic disruption goes with it. With economic disruption comes social and political disruption. Disruption can be revolutionary, so it’s my opinion that we, as a society, a culture, are in the midst of a full-fledged revolution. To be honest, this isn’t really something new; it’s been going on for quite a while, but it’s interesting to see how “millennials” spin it as something of their own creation. But in the end some things never change…like the “geeky” new GE developer who can’t get his friends or family to take him seriously because he doesn’t program video games or swing a sledgehammer.
As you may have guessed, I’m no millennial, but I remember going through the same indifference my entire career. Over the last 20 years I’ve worked in every vertical from foreign exchange to publishing, politics, TV and advertising. Advertising, I hope, is the last leg on that long journey…but it was only in advertising that people’s eyes stopped glazing over when I told them what I did. Most still aren’t sure exactly what it is I do, but if it’s “advertising” it must be cool! Point being, tech has been around for 50-60 years, and as long as it’s been around, it has NEVER, EVER been cool. Until now. From the Bill Gates era to the Zuckerberg era, something has dramatically changed. The rise of the millennial has led to the rise of the entrepreneur! Every “millennial” I meet has it all figured out: the great idea, the right investor, build the team and voilà! instant millions! It’s the IDEA that is the diamond in the rough, a way to tell your friends when you’re brushing your teeth or eating a pizza! The actual technology for how to do this is usually almost an afterthought. The people who are going to build these ground-breaking, multi-million-dollar blockbuster businesses are at the very bottom of the list. This I have never been able to understand.
A good developer, I mean a REALLY good developer, is worth twice his weight in gold. Without the tech nerds, white or black, male or female, old or young, to churn out that living code, there would be no app, no business, no millions. But for some odd reason, they’re still the last item on the list, as if, when the time comes for the code monkeys to do their thing, someone hops in a pickup truck, drives to a corner at 6 in the morning and says “you, you and you” to a group of ne’er-do-well men anxious to earn enough to eat for the day. I just don’t get it. Yes, I know when you read ads for developers, programmers and architects you see 6-figure salaries and alluring perks. There was a time in Silicon Valley when top-of-the-line devs got signing bonuses the same as professional athletes! But that’s definitely a “west coast” thing. Here on the “east coast”, and in particular NYC, the demand for good devs is through the roof, but the process of finding them is itself bordering on the absurd.
The problem, to me, is a simple one. There is quite a disparity between the people who need the talent and the talent itself. In the old days, if you were a bricklayer, for instance, you would work your way up to becoming a contractor and then YOU would hire bricklayers. In today's era, some techies give up coding for the "big office," but only after they pass their trial by fire on the front lines, and even then not often enough. Usually the "idea" guys are young millennials whose only desire is to get rich by the time they're 30, retire at 40, then become VCs themselves. As for developers, well, they simply cease to exist past the age of 35. They either die on the spot or dissolve into some ethereal state, only to be reincarnated into the next generation. When I was a kid I wanted to be a fireman. I never realized there are no 50-year-old firemen except the ones that ride in cars…not the cool red trucks.
In any case, the technology world is rapidly becoming filled with more fallacy than functionality. In the "real" world, New York investors, visionaries and millionaires of the future see developers/programmers/techies the way they see any other job on the job market, with a fixed idea of what they do and how they do it. Especially if they aren't the ones doing it! If you're a bank teller, you interface with customers, complete transactions and count money. Simple. You interview a bank teller on how many years he's interfaced with customers, completed transactions and counted money. If he or she answers the questions correctly and has a nice smile, they're hired…simple enough.
The current fad is for "sweatshop" development, where you fill a room with "programmers" and just go at it for 50-60 hours a week, keeping up with the market. The market being whatever vertical you happen to be in: games, ring tones, tech ads, ecomm, it doesn't matter. Throw bodies behind a computer, assess who's doing the most work over a period of time, keep the high performers (before they jump ship) and let the rest go. That seems to work for a while, but it cannot last. Programming is a talent, and like all true talent it is made more valuable by its rarity, not its commonality. A good developer is NOT a commodity and should not be treated like one. They're not just part of the business, they ARE the business!
Posted by the kid On May 6, 2015
Matchmaker, Matchmaker, make me a match! There is a new wave of technology waiting to explode onto the world. We're already familiar with it to some degree, especially the hard-core "geeks" who write all these programs that are rapidly taking over the world. Anyone who calls themselves a programmer cannot function without a good string "matching" function in their favorite text editor. In this case, "strings" of letters and characters that can be found among thousands of lines of otherwise indecipherable code. But you don't have to be a programmer to exploit this particular technology. Every time you type into that simple Google search box the process begins. Behind that little white box is a "black box" of code, algorithms and mathematics to match your query with information, images, videos, whatever, from all over the globe, 24 hours a day. But the most familiar applications of matchmaking are the good old dating sites…eHarmony, Match.com, OKCupid, OurTime…all wildly successful at bringing hopeful A together with lonely B. Before these sites, if people couldn't "match" up face to face (God forbid), they had to rely on hours and hours of Craigslist "personals". Even further back, during the technical Dark Ages before the Internet, this laborious task had to be done by scouring local newspapers, if not local bars! The hit or miss of an SBM searching for a WJW for some frisky RNR was a risky proposition at best! Now, however, the "Secret Sauce" these dating sites provide all but takes the risk out of it, provided of course your match isn't using a 20-year-old photograph taken during "better" days. Well, you can't remove all the risks!
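For the non-programmers, that editor "find" function boils down to something very small. Here is a toy version, a minimal sketch using Python's standard `re` module (the snippet of "code" being searched is made up for illustration):

```python
import re

def find_word(word, text):
    """Return the position of every whole-word occurrence of `word` in `text`."""
    return [m.start() for m in re.finditer(rf"\b{re.escape(word)}\b", text)]

# A line of otherwise indecipherable code, searched for the string "val".
code = "val = get_val(); total += val; return total"
print(find_word("val", code))  # [0, 26] -- skips the "val" buried inside get_val
```

The word-boundary markers (`\b`) are what make it a "match" rather than a blind substring scan, which is also roughly the difference between a dumb search and a smart one.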
The point is, the boom in matching technology is moving beyond bringing A and B together for love, romance and "other things". The "Secret Sauce" is actually based on deterministic mathematical proofs that can be programmed and predicted. Ironically, one such algorithm is known as the "Stable Marriage" theorem (https://www.youtube.com/watch?v=5RSMLgy06Ew). This can explain some of the success of the dating sites. In other words, the "Secret Sauce" actually works! But these same algorithms need not be restricted to dating sites. Now we can match you with your new home or apartment (apartment.com), or even match aspiring doctors with the hospitals where they'll do their internships and residencies (nrmp.org). And even more infamously, Wall Street, where "programmatic trading" has been the cause of many a fortune won and many a fortune lost. But the newest and most rapidly changing frontier is, yes, you guessed it…advertising! In my previous article on "programmatic advertising" (http://sonyainc.net/wordpress/?p=379) I talked about the rising trend of letting computers do all the hard work of advertising on computers. Makes sense, but computers are computers. They can only display ads, not react to them. That is still the realm of humans, and much to the relief of the many humans who have built careers in advertising, computers aren't ready to take over just yet. There still has to be a "human" factor added to the infamous, automated RTB (Real Time Bidding) that's taking the industry by storm. That factor is the appeal of the content being matched with the pocketbooks of the advertiser. Some of this is automated, of course, that's the point of programmatic advertising, but humans still have to make SOME choices: what networks to use, what kind of audience is being sought, how much money should be spent. But so far, programmatic advertising seems more concerned with advertisers and publishers than with us, the consumers.
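For the curious, the "Stable Marriage" algorithm (Gale-Shapley, the math behind that video) is short enough to sketch in a few lines of Python. The names and preference lists below are invented purely for illustration; real dating sites layer plenty of proprietary sauce on top of this skeleton:

```python
def stable_match(proposer_prefs, acceptor_prefs):
    """Gale-Shapley: return a stable matching as {proposer: acceptor}."""
    # rank[a][p] = how highly acceptor a ranks proposer p (lower = better)
    rank = {a: {p: i for i, p in enumerate(prefs)}
            for a, prefs in acceptor_prefs.items()}
    free = list(proposer_prefs)                   # proposers still unmatched
    next_choice = {p: 0 for p in proposer_prefs}  # next acceptor each will try
    engaged = {}                                  # acceptor -> proposer

    while free:
        p = free.pop(0)
        a = proposer_prefs[p][next_choice[p]]     # p proposes down his list
        next_choice[p] += 1
        if a not in engaged:
            engaged[a] = p                        # a accepts a first proposal
        elif rank[a][p] < rank[a][engaged[a]]:
            free.append(engaged[a])               # a trades up; ex is free again
            engaged[a] = p
        else:
            free.append(p)                        # a rejects p; p keeps trying

    return {p: a for a, p in engaged.items()}

matches = stable_match(
    {"adam": ["beth", "dana"], "carl": ["beth", "dana"]},
    {"beth": ["carl", "adam"], "dana": ["adam", "carl"]},
)
print(matches)  # {'carl': 'beth', 'adam': 'dana'}
```

"Stable" here means no two people would both rather be with each other than with their assigned partners, which is exactly the guarantee the theorem proves.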
Let's step back a little bit. It is no coincidence that a few of the same names that started out on Wall Street developing algorithms for programmatic trading have found themselves pioneering programmatic advertising, but with one major difference. Financial markets are seen as "zero-sum", i.e., if there is $10 on the table and that $10 goes to me, it does not go to someone else. In other words, if I win, someone else loses. There's been a lot of angst and worry inside the industry, especially among "traditional" advertising agencies, that "programmatic" also means zero-sum. Actually, nothing could be further from the truth. I see programmatic advertising as a powerfully synergistic force…almost frighteningly so. Put it this way: I am a publisher, I have "content/audience"…advertisers want to reach my audience through my content, and I want those advertising dollars to fuel my content to build my audience. A win-win situation, you say? Perhaps. But what about the poor consumer, already inundated with thousands of ads, solicitations and annoying offers! Argh!
I'm Joe Consumer, what does "programmatic" mean to me? I do a quick search one day on an expensive watch just to see how expensive expensive can be. I have absolutely no intention of actually buying the thing (unless my startup is about to go IPO, of course). But for the next few days, whenever I'm on Twitter or Facebook or another digital "ad" network flavored by someone's "secret sauce", I'm inundated with ads for Expensivo watches. How annoying and creepy! Until I realize I want to buy a writing stylus for my trusty iPad (styluses, by the way, are not sold by Apple). Darn…where do you buy these things? Hmm…let me do a quick search for "iPad writing stylus"… All my options come up as usual, and now I can take my pick…or not. What's the rush? Over the next few days, providing the "secret sauce" is working, I only need to do what I usually do on Facebook, Twitter and what have you. Stylus ads start popping up all over the place.
In other words, ads aren’t so bad if they’re ads I actually want to see. Mind you, I still wouldn’t mind an “off” button, but even I have to admit I find ads for products I’m genuinely interested in much less annoying than the thousands I’m not. I’m also still no fan of boundless commercialism but as long as we live in a consumer society I don’t mind having my own choices as to what I consume and just as importantly…where I can go to consume them.
Posted by admin On March 5, 2015
The relationship between advertising and technology has always been a tenuous one. The two have gone hand in hand since the invention of the printing press, but it's always been an odd partnership, both symbiotic and dysfunctional at the same time. And now, well into the 21st century, it has become only more so. Advertising relies on "communication" to identify, influence and even coerce the human psyche into buying, or not buying, material and spiritual goods, relying on eye-catching visuals, comforting stories and catchy ideas. Technology provides the "media" to bring those visuals, stories and ideas to the sometimes willing, sometimes reluctant masses. In advertising you can never have enough eyeballs. However, it appears technology advances faster than marketing ideas…sometimes too fast.
As we've progressed from graffiti to newspapers to radio to television to the internet, the idea has always been the same…the more eyeballs the better. But right up to the internet, the medium was only the conveyor of the message, a message that usually stood separate in and of itself. Before the internet, media was passive. The masses read, listened or watched, and what they did next was unpredictable and random. The goal of the advertiser was only to deliver the message; after that it was up to the advertising gods and human nature to determine the actions of the individuals who consumed it. But with the rise of the internet came a breakthrough in advertising science…the Banner Ad!
It's been 20 years since the invention of the banner ad, proudly displayed by the then-innovative HotWired. The idea being that with this new, more personal medium, one that could not only reach the masses but actually capture their responses, reactions or revulsion, the Rubicon had been crossed. Advertising has been in the "digital" age ever since. Don't scoff: the original banner ad had a whopping 44% click-through rate, meaning that a little less than half the people who saw it on their computer clicked on it, usually just out of sheer curiosity. 20 years later, a click-through rate of 1.5% is considered successful. Why?
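The arithmetic behind those percentages is trivial, which is part of why advertisers love the metric so much. A quick sketch, using the article's rates against a hypothetical thousand impressions:

```python
def ctr(clicks, impressions):
    """Click-through rate: the fraction of ad views that became clicks."""
    return clicks / impressions

# 1994-style banner: 440 clicks out of 1,000 views
print(f"{ctr(440, 1_000):.1%}")  # 44.0%

# A "successful" banner today: 15 clicks out of 1,000 views
print(f"{ctr(15, 1_000):.1%}")   # 1.5%
```

Same formula, a thirty-fold collapse in the result. The rest of this section is about why.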
Well, like any newfound power, the last 20 years have been spent squeezing as much juice out of the banner ad as humanly possible, and the banner ad has infiltrated and dominated internet advertising ever since. Advertisers love anything that can be measured, quantified and packaged in the never-ending search for ROI. And why not? If I spend a million dollars on an ad campaign, it's only fair to assume I should get at least a million and one in return. Why else spend the million in the first place?
Of course, in the real world, it's never been that easy. When the first graffiti artist wrote "eruntque comedantes in Joe scriptor" (Eat at Joe's) on the Colosseum wall, he didn't know exactly how many people would see it; he only knew that none of them would have seen it if he hadn't written it there at all. And if that meant that just one more person showed up at Joe's because they did read it, then the "ad" could be considered a success relative to its cost: the time it took to write it.
To some degree, up until 1994, that was the general business model of all advertising, everywhere. But now that we're in the "digital" age, advertisers want to know exactly who did what and where. It's not important that consumers hate banner ads; it's important that the 1.5% who don't hate them can be measured, quantified and packaged. In short, the power of the number is seen as more powerful than the creative idea. I guess…after all, it was someone's idea to write "Eat at Joe's" in the first place, but probably not to sit and watch to see how many people read it and how many people actually showed up at Joe's because of it. That would come later with the invention of the coupon, but that's another story.
This brings us back to technology. The medium of the internet is no longer passive, but "interactive". The user can now interact with the advert directly and on the spot. Unfortunately, over the last 20 years this seems to have become more important than whether the user wants to or not. Mind you, as banner ads have evolved, they have striven to become more informative, more entertaining, more captivating, but let's face it: in your average perusal of the internet, how many banner ads do you click on?
Granted, the first 10 years or so of internet advertising were more novelty than science, but as the internet grew with the wider availability of broadband, things started getting serious. More advertisers started diverting more of their advertising budgets to digital advertising as opposed to traditional print and television, once again chasing those ever-increasing numbers. The more eyeballs the better. But this posed a dilemma in determining whether all these electronic ads were having any positive effect on all those eyeballs. What was more important: how many people saw the ad (impressions), or how many people "clicked" on it (CTR, CPM…anything with a "C" in front of it)?
The debate was just heating up when Google unleashed "targeted" advertising on the world. Now, instead of just flashing as many ads as possible in front of as many eyeballs as possible, specific ads could be shown to the specific eyeballs much more inclined to respond to them. Mind you, this sometimes meant fewer eyeballs, but if it meant more identifiable responses, then obviously that's a good thing, right? Well, whether it is or not, Google, a simple technology company whose core business had been providing a search engine people could use to sift through the infinite volumes of information on the internet, became an advertising powerhouse practically overnight! Billions of dollars poured into the technology cupboard and a new science was born…targeted advertising.
In reality, targeted advertising is nothing new. It's why you see all those beer commercials during football games, Hyundai commercials during The Big Bang Theory and finance commercials during 60 Minutes. The only difference is that in television-land the targets are much, much bigger. But there's the catch. In internet advertising, who sets the targets? How big or small should those targets be? How do you target consumers without "creeping" them out with Minority Report intrusion? The rise of digital-only advertising has had some measurable success, but even digital-only shops employ the black art of "creativity" to create campaigns, some successful, some not.
The answer? Let's dispense with the creativity and leave it all to science…the rise of Programmatic. What is programmatic advertising? I recently did a search on The Flash (the superhero, not the software) television series on Amazon VOD. The next time I opened up my Facebook page, there was an ad to buy the Flash series on VOD. Why? How? Because a programmatic ad server matched an advertiser's ad, whoever is selling the Flash DVD, with an ad publisher, Facebook. Who knows how? But I wasn't creeped out or alarmed. In fact, I was mildly amused. It could be creepy that my interests follow me around on the Net, but I did like the fact that it at least displayed something I was genuinely interested in. It was not the first programmatic ad I have seen, but it's the first one I actually liked. It would have had a much greater impact, however, if it had appeared on the Amazon site when I really was interested in the topic. But the trend is clear. The technology is here, and even though no one quite understands it, including me, and I'm in the business, it's here to stay.
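The matching step behind that Flash ad can be sketched in miniature. Under the hood, most programmatic exchanges run a real-time auction: advertisers whose targeting matches the impression bid on it, the highest bidder wins, and (in the common second-price design) pays the runner-up's bid. A minimal sketch follows; the advertiser names, topics and dollar amounts are all invented for illustration:

```python
def run_auction(impression_topics, bids):
    """Second-price auction over bids: list of (advertiser, target_topic, max_bid)."""
    # Only advertisers whose targeting matches this impression may compete.
    eligible = [(amount, name) for name, topic, amount in bids
                if topic in impression_topics]
    if not eligible:
        return None                      # no matching advertiser, no ad shown
    eligible.sort(reverse=True)          # highest bid first
    winner = eligible[0][1]
    # Second-price rule: pay the runner-up's bid, or your own if unopposed.
    price = eligible[1][0] if len(eligible) > 1 else eligible[0][0]
    return winner, price

result = run_auction(
    {"tv", "superheroes"},               # what the ad server knows I browsed
    [("flash_dvd_seller", "superheroes", 2.50),
     ("watch_brand", "luxury", 4.00),    # bids more, but targets the wrong topic
     ("streaming_service", "tv", 1.75)],
)
print(result)  # ('flash_dvd_seller', 1.75)
```

Note that the luxury watch brand loses despite the biggest bid, because its targeting doesn't match, which is exactly the "specific ads to specific eyeballs" shift described above. The human choices (networks, audiences, budgets) live in those topic and bid parameters.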
Posted by admin On October 30, 2014
There's a new trend going around, still below the radar but bound to become much more visible in the coming years. Lately, the world seems to have fallen into some incredulous, nefarious and almost puerile fascination with Silicon Valley success. Not that it's unwarranted. Practically the entire American economy seems to be floating on two big banana boats named Apple and Google. Fine. No problem here. But I can remember when both of these "titans" of the digital age were either fledgling or floundering. As always, we of the media like to sensationalize success by acting like there is no such thing as failure. In truth, there can never be any success without failure. But that's not what I want to discuss right now.
I'm more interested in what will happen when the luster wears off and these two "giants" of industry become the U.S. Steel and General Electric of the 21st century. Yes, 100 years ago it was these two powerhouse corporations that were leading the world into the 20th century! They were the Wall Street darlings and the flagships of U.S. of A. world dominance. Of course, neither had reached its full potential, and both were greatly overshadowed by the downfall of Standard Oil, but these two young and ambitious companies were ready to lead the industrial charge into an ever-changing, ever more competitive world. After a catastrophic war, they emerged as the new titans in an over-exuberant, overly optimistic world. Rich and poor alike valued their future according to how much stock they owned in either company, or both.
"Bethlehem Steel stock rose from a pre-war average of $25 to $700 in 1916." Sound familiar? As of this writing, U.S. Steel stock trades at around $28 per share and GE around $25. Both are considered very "affordable" stocks.
Well, we all know what eventually happened, and there's no reason to assume there will be more wars and depressions in the 21st century (though, unfortunately, they can't be ruled out either). But the point I'm trying to make is simple. What shines brightly now will not shine forever. Not that Apple and Google won't continue to be the technical giants that they are, but the world, especially in technology, is always changing, always evolving, and these days at breakneck speed. Silicon Valley was not always Silicon Valley. Before the geniuses of innovation rose to acclaim in San Jose and San Mateo, the east dominated the digital landscape. Areas like Route 128 in Massachusetts and the Beltway around metropolitan Washington, D.C. were the places to be for nerds and capitalists alike. Anyone remember AOL? But that was the age of proprietary hardware and shrink-wrap software. Practically prehistoric by today's standards. But that's the point. Silicon Valley didn't become Silicon Valley because California has a propensity to grow technical genius the way it produces oranges. No. It was an early exponent of open systems and open source, and it embraced technologies like Unix while the rest of the world clung to proprietary, licensed and very expensive operating systems. In short, California techies were simply in the right place at the right time. Granted, it was the Jobs and Gates of the world who knew how to take advantage of it, but the world is a mighty big place.
We are now living in a digital world where communication is the new Bessemer process and the Internet the new electric grid. The Carnegies and J.P. Morgans were the "visionaries" of their day too. But because of where we are technically, financially and, more importantly, culturally, new visionaries will arise once again on the east coast of the U.S., and in one place in particular. The next rival to Apple will be nothing less than the Big Apple. The time has come. Every millennial in the greater New York area has been weaned on the concept, notion and inevitability of entrepreneurship. No one believes in getting rich by working for a corporation; in fact, to them that's giving up all hope of any future at all. We'll talk about that in a later article. But you also don't have to run off to Silicon Valley to make your fortune. Silicon Alley is growing, and growing fast. And this time we're not looking at an east coast Libertarian society of elitist white males congratulating themselves on their own cleverness (well, at least not in the long run) but a polyglot of cultures, races and creeds, male and female, doing what New Yorkers have been doing since they opened the Erie Canal. Everyone from hard-core street kids to pampered socialites will be pitching, hustling and, most importantly, BUILDING their technical dreams.
True, the first success stories will come from the Flatiron District and Brooklyn's Dumbo, but by the 2020s they will be in East Harlem, Chinatown and Hell's Kitchen. They will be on the Grand Concourse and Roosevelt Ave. The Big Apple hasn't exploited the enormous technical and entrepreneurial talent that has always been here because cultures, like bad habits, die hard. For years New York's media, finance and banking, and, above all, advertising industries have confused high tech with "IT". They're the wonky guys with pencil holders and horn-rimmed glasses people talk to when they can't get their email or a printout. But now those "IT" guys are developing minds of their own and a whole new culture to go with it. The allure and mystique of entrepreneurship has been seeded by the Larry Pages, Mark Zuckerbergs and, of course, the god of all techie gods, the late Steve Jobs. Nerds are no longer just nerds, but "visionaries" eagerly sought after by the capital markets of the world. And where are the best capital markets, you ask? Why, of course, the financial center of America…good old NYC!
The "glamour" VCs may still be clustered in the "Valley", growing ever wealthier on the Elon Musks and Peter Thiels of the world, but the older, savvier and more aggressive old-school VCs of lower Manhattan are slowly awakening to the infinite potential of "hi-tech". Venmo, MongoDB and Etsy don't flow as freely off the tongue as Google, Facebook and Apple…but eventually the sheer number of startups will create its own center of gravity inside the Empire State. State-based incentives meant to encourage innovation, growth and investment, on a scale not seen since the Great Depression, will assure a long-lasting and earnest desire for entrepreneurs to start, build and sell companies here like nobody's business. But unlike the cool, aloof libertarians with their narrow viewpoints and open collars, the New York entrepreneur will be tough, gritty and determined. They will be anxious and open-minded, tenacious and resourceful…in short, true New Yoikers! It may take a decade or two, but Manhattan is destined to be the next Silicon Island, perhaps, once again, the financial and technical pearl of the world.