As reported in Wired magazine, the authors of a new book, The Bestseller Code, "fed 5,000 fiction titles published over the last 30 years into computers . . . [and] used so-called machine classification algorithms to isolate the features most common in bestsellers.
"The result of their work . . . is an algorithm built to predict, with 80 percent accuracy, which novels will become mega-bestsellers."
I stopped reading at that point. It seemed obvious to me that "the features most common in bestsellers" in the past will not necessarily be the most common in bestsellers in the future. Things change.
But don't take my word for it. Cathy O'Neil (PhD in math from Harvard, professor at Barnard College, etc.) has written a book called Weapons of Math Destruction. It is about algorithms. I heard her speak at the Mechanics' Institute Library. She summed up her position by saying, "algorithms replicate the past." Apparently fans of algorithms dispute the point by claiming their algorithms are so smart that they keep learning as they are used. But that doesn't change anything, because what they are learning from is still the past.
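To make the point concrete, here is a minimal sketch of the kind of classifier being described. The data and feature names are entirely hypothetical (the actual model in The Bestseller Code is not public); the point is only that such a model memorizes the averages of past examples, so a new title is judged purely by how much it resembles past bestsellers:

```python
# Toy nearest-centroid classifier (hypothetical data, not the authors' model).
# Each past title: a tuple of feature scores plus a bestseller label.
past_titles = [
    ((0.9, 0.8), True),   # past bestsellers
    ((0.8, 0.9), True),
    ((0.2, 0.1), False),  # past non-bestsellers
    ((0.3, 0.2), False),
]

def centroid(points):
    """Average each feature across a list of feature tuples."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

# The "learning" step: just summarize past hits and misses.
hits = centroid([f for f, label in past_titles if label])
misses = centroid([f for f, label in past_titles if not label])

def predict(features):
    """Label a new title by whichever past centroid it sits closer to."""
    d_hit = sum((a - b) ** 2 for a, b in zip(features, hits))
    d_miss = sum((a - b) ** 2 for a, b in zip(features, misses))
    return d_hit < d_miss  # True means "predicted bestseller"

# A new title is predicted to succeed only insofar as it resembles
# past successes; the model cannot anticipate a shift in taste.
print(predict((0.85, 0.9)))  # resembles past hits -> True
```

Even if the model is retrained as new sales data arrives, the mechanism is the same: the centroids move, but they are always computed from what has already sold.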
Many of us can remember when personal computers first began landing on people's desks. Around that time it was common to see things like investment advice advertised as "done by computer." The assumption was that the information must therefore be more precise, more complete, more correct. I think algorithms have taken over as the new "secret sauce" that supposedly makes everything smarter.