October 5, 2011
Most politicians are slaves to polls – particularly politicians running for office. But polls inherently measure what's happened in the past. Sooner or later, a forward-thinking politician is going to figure out a way to ditch polls and get out ahead of everybody by using predictive technology – what we call "two-second advantage" technology.
Political polling is certainly a sophisticated business. But no matter how good it gets, it's still relying on questions asked of people days or weeks before.
There are already more instantaneous options – real-time information streams that could help determine how candidates are faring. One big one: search queries on Google. Data jockeys have figured out they can pretty accurately predict voting on American Idol based on search traffic. Google.org, the company's philanthropic arm, has begun monitoring searches for flu-like symptoms in specific regions to anticipate an epidemic – and possibly stave off a pandemic. Search traffic could be a strong real-time indicator of the popularity of a political candidate or policy.
By itself search traffic is probably not enough to give politicians accurate guidance. But there are other data streams that could be tapped: hits on the candidates' web sites, media mentions, money raised as reported to the Federal Election Commission. If all of that could be captured in one place and fed through an efficient bit of brainlike software, we could anticipate who would win the election if it were held tomorrow – not who people said they'd vote for in a poll that's already more than a few news cycles out of date.
Nate Silver, a geek who had already helped advance the revolution in how baseball uses statistics to evaluate player performance, turned his attention to polls in the last presidential cycle with a wildly popular site, www.fivethirtyeight.com. The site is now part of the New York Times' online network.
Silver doesn't run any polls himself. He just takes the polls everyone else is doing, assigns them weight based in part on how accurate they've proven in the past, pulls it all together, and makes better predictions than anyone else on what direction a race is heading.
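Silver's basic move – weighting each poll by its track record before averaging – can be sketched in a few lines. This is a simplified illustration of the general idea, not his actual model; the pollster names, percentages, and error figures below are invented:

```python
# Simplified sketch of accuracy-weighted poll aggregation.
# All pollster names and numbers are invented for illustration.

polls = [
    {"pollster": "Poll A", "candidate_pct": 48.0, "past_avg_error": 2.0},
    {"pollster": "Poll B", "candidate_pct": 52.0, "past_avg_error": 4.0},
    {"pollster": "Poll C", "candidate_pct": 50.0, "past_avg_error": 1.0},
]

def weighted_average(polls):
    # Weight each poll by the inverse of its historical average error,
    # so pollsters who have been more accurate count for more.
    weights = [1.0 / p["past_avg_error"] for p in polls]
    total = sum(w * p["candidate_pct"] for w, p in zip(weights, polls))
    return total / sum(weights)

estimate = weighted_average(polls)  # pulled toward the most accurate poll
```

Here the estimate lands closer to Poll C's number than a naive average would, because Poll C has the best track record. Silver's real method adds many more adjustments (sample size, recency, house effects), but the weighting principle is the same.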
In some ways, it's a computerized version of the brain of Boston Mayor Thomas Menino, whom we talked to in our book. Menino has led the Hub since 1993, and he attributes his longevity to his ability to meet as many people as possible and build a model of his city in his head – not to any modern technocratic crunching of data. "I never do polls or pay attention to them," Menino told us. "They're a snapshot of what people thought three days ago, at best. I don't use consultants."
It's reminiscent of a famous but fictional longtime Boston mayor, Frank Skeffington, the hero of Edwin O'Connor's classic political novel "The Last Hurrah." He didn't need polls on a fateful election night; while his supporters celebrated his certain victory, he saw in the results of a few precincts exactly which way the wind was blowing:
"For now the signs were there for him to read; as he watched the new figures chalked in upon the board, he read their meaning clearly, and once again, in advance of all others in the crowded room, he sensed misfortune – this time, not its possibility, but its confirmation. And while the clamor rose on newfound hope…. Skeffington stood erect, impassive, perfectly still: the unique possessor of the knowledge that he was a beaten man."
Technology is on the brink of becoming as predictive as a Menino or a Skeffington. It will be interesting to watch for a politician willing to think differently and embrace a little of the right information just ahead of when it's needed, instead of culling through data about the past.
September 26, 2011
This just in: The Two-Second Advantage is No. 61 on USA Today's list of top 100 bestselling books.
September 26, 2011
Out in September 2011, the book hit NUMBER ONE on Amazon.com its first day. Published by Crown Business, The Two-Second Advantage is a journey through the intersection of brain science and computer science. It delves into the science behind talent, and how discoveries about human talent are being used to develop the next generation of computers.
September 25, 2011
The Two-Second Advantage on the New York Times bestseller list.
September 16, 2011
Venture capital powerhouse Andreessen Horowitz hosted a book launch event at their offices earlier this week. Some of Silicon Valley's stars stopped by, including Scott McNealy, Kim Polese, Jeff Hawkins, Chad Hurley, and a host of others.
The books, waiting to be signed.
Vivek and Kevin during a brief Q&A.
Kevin, at the signing table.
Vivek autographing a book.
Vivek and Kevin talking to Jeff Hawkins.
September 6, 2011
Watch Now: The Two-Second Advantage Book Trailer
September 6, 2011
Can Technology Help Businesses To Get Ahead of the Game?
by Vivek Ranadivé and Kevin Maney
Featured by Sramana Mitra.com
September 2, 2011
Republican presidential candidate Ron Paul says the U.S. should abolish the Federal Reserve. But what really needs to be abolished is the antiquated way the Fed currently works – replacing its methods with what we call "two-second advantage" technology.
Leaving political arguments aside, what's clear is that the Fed technologically operates like a twentieth-century entity. It relies on massive amounts of data served up in batches, informing Fed governors about economic conditions in the near-past, but not the present or near-future. The Fed's decision-making is slow and reactive, like turning the wheel of a car after it hits a pothole.
Briefly put, the Fed's job is to ensure a suitable temperature in the economy so that inflation and employment stay within an acceptable range. (We want some inflation but not too much, and some unemployment but not too much.) How does the Fed do that? Mostly, it meets eight times a year to decide whether to increase or lower interest rates. How does the Fed get input about the state of the economy so the governors can make such critical decisions about interest rates? Six weeks before each Fed meeting, the regional Fed banks survey a range of business people and bankers in their regions. They also pull together numbers about economic conditions in the area. By the time the information gets to the Fed, it's old news.
So, then, how might the Fed work better in an era of two-second advantage technology? Start with the algorithmic trading systems you find in the financial industry. Those systems continuously watch real-time data streams from a number of sources – prices of stocks, bonds, commodities, currencies, but also news from sources such as Bloomberg and Reuters. They run that data through models that make predictions about where certain investments are heading, and they take action in the form of trades, all without human intervention. If private systems can watch those data streams and generate instant predictions about where things are heading, the Fed can, too. In fact, the Fed could be given the right to tap into the different private algorithmic trading systems – so the Fed sees what they're seeing and what they're doing.
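The ingest-predict-act loop those trading systems run can be sketched in miniature. This is a toy illustration of the pattern, not any real trading system; the tick prices and the moving-average rule are invented:

```python
from collections import deque

# Toy sketch of an event-driven predict-and-act loop, loosely in the
# spirit of the trading systems described above. All data is invented.

class StreamPredictor:
    def __init__(self, window=3):
        # Keep only the last few observations; deque drops the oldest
        # automatically once maxlen is reached.
        self.recent = deque(maxlen=window)

    def on_event(self, price):
        """Ingest one tick and return an action for the next moment."""
        self.recent.append(price)
        if len(self.recent) < self.recent.maxlen:
            return "hold"  # not enough history to predict yet
        avg = sum(self.recent) / len(self.recent)
        # Naive rule: a tick above its own moving average suggests
        # continued upward movement, and vice versa.
        if price > avg:
            return "buy"
        if price < avg:
            return "sell"
        return "hold"

predictor = StreamPredictor()
ticks = [100.0, 101.0, 102.5, 101.5, 100.0]
actions = [predictor.on_event(t) for t in ticks]
```

Real systems use far richer models and many correlated streams at once, but the shape is the same: every incoming event updates the model, and the prediction is acted on immediately rather than batched for later review.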
The Fed could, in addition, have access to data streams that Wall Street systems don't see. Sam's Club tracks purchases of individual members and is able to predict with great accuracy what they're going to buy next in a certain window of time. The Fed could tap into that in real time and always have a picture of what consumers are about to buy – instead of always looking at what they bought weeks or months ago. Xcel Energy and other utilities are increasingly putting in smart systems that take real-time readings and predict energy use, which can be an economic indicator. Federal Express relies on a sophisticated real-time system that always knows how many packages are moving to which cities, which could provide a constant barometer of regional economies. As the next decade unfolds, much of the economy will be operating on real-time and predictive systems.
The Fed could watch all of it – constantly. It could see the U.S. economy the way Wayne Gretzky saw a hockey game in progress: thousands of factors all instantly captured and processed and understood. The Fed's systems could see patterns and make predictions. But, importantly, if the Fed is going to govern the economy the way Gretzky played hockey, the Fed can't meet eight times a year and try to predict what's going to happen to the economy over the coming months. Instead, the Federal Reserve would have to hand over the power to adjust rates to its predictive system, which would adjust rates all the time. Instead of trying to guide the economy with long, sweeping actions based on long-term guesses about economic conditions, the Fed would steer the economy by constantly making very short term, highly accurate predictions about economic conditions, and adjusting rates on the fly by as little as one one-hundredth of a percentage point.
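Continuous rate-setting of this kind is essentially a feedback control loop. Here is a deliberately crude toy version – nothing like a real monetary-policy model, and every number in it is invented – just to show the shape of "many tiny corrections" versus "a few big ones":

```python
# Toy feedback loop: nudge the policy rate by at most 0.01 percentage
# point per step, based on whether predicted inflation is running
# above or below target. All figures are invented for illustration.

TARGET_INFLATION = 2.0   # percent
STEP = 0.01              # one one-hundredth of a percentage point

def adjust_rate(current_rate, predicted_inflation):
    if predicted_inflation > TARGET_INFLATION:
        return round(current_rate + STEP, 4)  # cool the economy slightly
    if predicted_inflation < TARGET_INFLATION:
        return round(current_rate - STEP, 4)  # stimulate slightly
    return current_rate                       # on target: leave it alone

rate = 1.50
for forecast in [2.3, 2.2, 2.1, 1.9, 2.0]:  # stream of short-term forecasts
    rate = adjust_rate(rate, forecast)
# rate drifts: 1.51, 1.52, 1.53, 1.52, 1.52
```

The point of the sketch is the cadence, not the rule: instead of a quarter-point move debated eight times a year, the rate drifts continuously in hundredths as each fresh prediction arrives.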
This doesn't mean completely turning over economic policy to machines. It means, in fact, that Fed governors would be freed to concentrate on overall economic policy and leave the dirty business of rate adjusting to the machines. The governors would decide the rules the machines would play by. How much inflation do we want? What kind of housing market? How much speculation in the markets? And, certainly, the governors could override the machines, pulling the plug if something went awry or taking extreme action in an emergency.
The system would be transparent. No more reading tea leaves to guess whether the federal funds rate was going to head up or down at the next Fed meeting. Rates would constantly change, and anyone would be able to see those changes and follow them on a computer screen, just as investors can follow the Dow Jones Industrial Average or the yield on Treasury bills.
The Fed could collect and see real-time data from a variety of sources using technology available today. Writing the algorithms and building the model that could tell such a system how to see the whole economy on its own, make predictions, and take action would be difficult and take time. But if math jockeys can do that kind of thing for hedge funds, they could do a more complex version for the Federal Reserve.
Building a system that could truly watch the economy and learn what it does, and from that make ever more accurate predictions – this is the stuff still going on in labs. But it's not the stuff of dreams. It's stuff that's going to be real in a decade or so.
The biggest barrier of all would probably be the very idea of ceding to machines what appears to be the Fed's main function. But monkeying with interest rates eight times a year is not the Fed's main function. As the Federal Reserve Act said, the Fed is there "to furnish an elastic currency, to afford means of rediscounting commercial paper, to establish a more effective supervision of banking in the United States, and for other purposes." Interest rate adjustments are a tool – albeit an important one – that the Fed can use to do its real job of making sure the economy functions smoothly and effectively. Machines that constantly read the economy and adjust rates will be able to automate that tool and wield it more efficiently. And machines will be able to keep up with the machines already deployed by financial firms. Hopefully the Fed's machines would then have a better chance of preventing Wall Street's machines from leading us into another boom and bust cycle.
August 31, 2011
In CIO magazine's Sept 1 issue, The Two-Second Advantage is called out as one of the books "we're reading." As the magazine writes:
"It will come as no surprise to you that data, when properly processed, provides a huge competitive advantage. This book breaks down how people and technology process information to figure out what will be the exact right thing to do a second from now and execute in time to take full advantage of the moment."
August 22, 2011
When the universe puts together a series of fortunate circumstances, it's probably a good idea to pay attention. Just such a set of circumstances led to this book. Kevin Maney grew up in upstate New York and has played hockey since he was young. (He still plays and has all his own teeth.) He's almost exactly the same age and size as Wayne Gretzky. Somehow or other, Gretzky became the greatest player in hockey history, and Kevin…didn't. But for years now, Kevin has been fascinated with Gretzky's most pronounced talent: his ability to know what was going to happen on the ice a second or two before anyone else.
In the mid-2000s, Palm cofounder Jeff Hawkins introduced Kevin to theories about how the brain works as a predictive machine, which led Kevin to explore the source of Wayne Gretzky's success in hockey. He'd begun sketching out ideas that had to do with predictiveness, talent, and Gretzky's brain for a new book he was contemplating.
Around the same time, Vivek Ranadivé, the CEO of TIBCO Software Inc., noticed the arrival of the next iteration of technology, one that combined real-time computing, context, in-memory software, complex event processing, and analytics. Information about events happening in the moment could be correlated with historical data to predict future patterns. The result? New capabilities that could anticipate what was about to take place and act with precision before that moment arrived. TIBCO makes software technology that can do this stuff.
Around TIBCO, Vivek started talking about his ideas. He felt that putting them into a book would help his employees understand his thinking and get technology and business people talking about the immense possibilities of this new capability. Vivek and Kevin met in late 2009 in a TIBCO conference room, where Vivek told Kevin what he was seeing in terms of advances in technology. Kevin told Vivek about predictiveness and the human brain. Vivek realized that Kevin's ideas about the abilities of Wayne Gretzky's brain sounded a lot like the systems he believed companies had to implement to be competitive in the twenty-first century. And Kevin was intrigued that computer scientists were on a path to help explain talents such as those possessed by Gretzky. They realized they had come across the right idea at the right time.
Technology is reaching a breaking point, with too much data overwhelming computing's capabilities, and a new model of information technology is needed. The predictive nature of the brain is an expanding area of scientific discovery. And the intersection of the two – computer science and neuroscience – is an increasingly hot field that's likely to give birth to the next generation of information technology. On their own, neither Vivek nor Kevin would've seen the connections between technology and the brain so clearly. Together, they found the fields created a perfect synergy for the 2010s.
This book is the result of their collaboration.