There is an old story about a French nobleman who sat quietly on a tufted sofa while his wife, the Marquise, was in the other room having a baby.
He sat there, probably polishing his fingernails or doing something equally idle and aristocratic, until the nurse entered the room bearing what turned out to be triplets. The Marquis was delighted.
“Please give my thanks and compliments to Madame,” he told the nurse. And then, pointing to the infant on the left he said, “And tell her that I choose this one.”
It’s a funnier story if you use a ridiculously over-the-top French accent, but the point is the same: some people just like choosing from among available options. The chooser almost always has the upper hand.
In Hollywood, the chooser is the executive in charge of picking which ideas for television shows will move forward to multiple episodes. Of course, there are a lot of other considerations at play – whether or not a big-name star can be added to the mix, the track record of the creative team – but at a certain point, every powerful Hollywood executive sits on the equivalent of a tufted sofa and points to the baby he or she wants to keep.
The other babies – scripts, ideas, books, notions, even expensively filmed television “pilot” episodes – are whisked away to an unknown fate.
This week in New York, each television network announces its upcoming slate of projects to the advertising community in carefully staged, elaborate productions. Each presentation is the culmination of a year’s worth of deal-making, wrangling, double-crossing, and that most human of all emotions: hope.
The assembled executives – the choosers, in other words – have chosen the television series they’re going to be spending billions of dollars to produce, and they’re now trying to gin up some enthusiasm from another group of choosers, the advertisers. That’s the problem with being a chooser: no matter how powerful you feel, there’s always another chooser just above you.
Executives in Hollywood rely on a lot of data to make their choices – market testing results, audience surveys, that sort of thing – but it basically all comes down to human intuition. No one really knows, or can predict, which shows will be global blockbusters and which will be quickly forgotten. The choosers are all human, which makes them imperfect and flawed.
But what if the choosers aren’t human? Facebook – that ubiquitous, always-on, always-watching social network – has a complicated array of algorithms to determine what, exactly, you’re interested in. Every time you log into your Facebook account – and the same is true for its scrappier rival, Twitter – the all-knowing machine records your clicks and scrolls and develops a finer, fuller picture of the kind of person you are and what arrests your attention.
Facebook and its cousins have taken the human touch out of the choosing business, and with it the possibility of showing you something you don’t want to see.
Or, that’s the claim. Recently, Facebook has come under scrutiny for alleged political bias in its “trending topics” sidebar. Former Facebook employees have told journalists that they were encouraged to promote certain news items and suppress others, despite what the algorithm was telling them to do. The company has denied this but has agreed to re-examine its policy of allowing manual adjustments to its algorithmic process, which is really just a fancy way of saying it’s going to get the human beings out of the business entirely.
The trouble with a purely algorithmic system that eliminates the need for the human touch, though, is that it ignores the human being on the other side of the exchange. If human choosers are unreliable and flawed – they order up the wrong television series sometimes, make unpopular films occasionally, and meddle with your Facebook and Twitter feeds out of political or cultural bias – human audiences are equally capricious.
How many times have you found yourself clicking through to a website or article you didn’t even know you were interested in? How many times have you discovered an interesting movie or television show just by flipping through the channels on your television? If you’re like me, the answer is: all the time.
That kind of serendipitous discovery is random and unplanned. And because it’s unplanned, it’s unpredictable, whether by a person or by an algorithm.
Silicon Valley investors have poured money into machines and software designed to replace the human chooser, but they haven’t been able to replace the human clicking through Facebook. The brigade of television executives in New York City this week presenting their choices to advertisers can’t possibly know whether the shows they’ve chosen will be more successful than the ones they haven’t, any more than the French aristocrat could know for certain that the infant he chose to keep was going to grow up smart and successful.
My intuition tells me that the other two he didn’t choose probably went on to great things. The world seems to work that way.
Rob Long is a writer and producer in Hollywood
On Twitter: @rcbl

