Bill Buchanan - Test-of-Time (ToT) for Research Papers: Some Papers Rocket, Some Papers Crash, But Most Never Go Anywhere
Description
In research, the publishing of high-quality papers is often critical for the development of a research career:
“I am an academic. It’s publish or perish.” Daniel J. Bernstein.
But we should measure the work in terms of quality rather than quantity. One high-quality research paper is probably worth more than the millions of papers published in predatory journals. A great researcher should be able to measure the quality of their work by the known impact and contribution of their research papers, and not by citation count or journal impact factor. In fact, review papers often contribute little to the development of new methods, yet they are some of the most highly cited papers.
A research paper thus has a life. Authors might dream that their work is going to fundamentally change a given field, but it can end up barely being read and withers away. Overall, most papers just bob along with a few citations a year, and you are lucky if you get more than 10 citations. Academics often follow the impact of their papers on Google Scholar, which can give an idea of whether their work is rising or on the wane. If you are interested, here’s mine showing a nice exponential rise over the past few years:
Some papers might rocket with many initial citations, as researchers cite them heavily, but then either the research area just dies off for a lack of interest, or problems are found with it. Isogenies within post-quantum methods are one example, where a single crack in SIDH (Supersingular Isogeny Diffie-Hellman) stalled some of the advancements in the field [here]:
Up to that point, isogenies were the poster child and the great hope for competing with lattice methods. While they were still slow, researchers were gearing up to address many of their performance weaknesses. They were much loved, as they used elliptic curves, but one paper stalled the isogeny steam train. I do believe they will return strong, but it will take a while to recover from such a serious crack. Cryptography is often about reputation, and a single crack can bring a whole method down.
Other papers, though, can be slow burners. The core papers in ECC (Elliptic Curve Cryptography), for example, did not take off until a few years after the work was published. When Neal Koblitz published his paper on “Elliptic curve cryptosystems” in 1987, it was hardly cited, and few people picked up on its potential to replace RSA signatures. In 1997 (10 years after the publication of the paper), it had still only achieved 41 citations. But things really took off around 2005, and especially when Satoshi Nakamoto adopted ECC for Bitcoin around 2009. It now sits at nearly 400 citations per year, and ECDSA and EdDSA have made a significant impact in replacing our cumbersome RSA methods:
Test-of-Time (ToT) Award
Now Chris Peikert, Brent Waters, and Vinod Vaikuntanathan (Via-kun-tan-athan) have been awarded the International Association for Cryptologic Research (IACR) Test-of-Time (ToT) Award for a paper entitled “A Framework for Efficient and Composable Oblivious Transfer”, presented at the Crypto 2008 conference [here][1]:
Overall, the Test-of-Time Award is given to papers published at least 15 years ago at one of the three IACR general conferences (Eurocrypt, Crypto and Asiacrypt).
The framework developed in the paper integrates “universal composability”, which provides strong security properties. Basically, a protocol P1 is secure if it emulates an ideal protocol P2, and it is not possible for any observer to tell the two apart. The paper also introduced a simple primitive called a “dual-mode cryptosystem”.
The work has been fundamental in creating Oblivious Transfer (OT) protocols, which are used in Multi-Party Computation (MPC). A great advancement of the paper is its usage of Learning With Errors (LWE), which is now used within lattice cryptography methods. The paper has since laid a foundation for lattice cryptography.
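To give a feel for what an Oblivious Transfer protocol achieves, here is a minimal toy sketch of 1-out-of-2 OT in Python, in the style of a Diffie-Hellman-based OT (similar in spirit to Chou-Orlandi). This is an assumption-laden illustration, not the dual-mode LWE construction from the awarded paper: the sender holds two messages, the receiver learns exactly one of them, and the sender never learns which one was chosen.

```python
# Toy 1-out-of-2 Oblivious Transfer sketch (Diffie-Hellman style).
# NOT the dual-mode cryptosystem from the Peikert-Vaikuntanathan-Waters
# paper -- just an illustration of the OT functionality. The XOR
# "encryption" and fixed randomness are for demonstration only.
import hashlib

# RFC 2409 1024-bit MODP prime, generator 2
P = int(
    "FFFFFFFFFFFFFFFFC90FDAA22168C234C4C6628B80DC1CD1"
    "29024E088A67CC74020BBEA63B139B22514A08798E3404DD"
    "EF9519B3CD3A431B302B0A6DF25F14374FE1356D6D51C245"
    "E485B576625E7EC6F44C42E9A637ED6B0BFF5CB6F406B7ED"
    "EE386BFB5A899FA5AE9F24117C4B1FE649286651ECE65381"
    "FFFFFFFFFFFFFFFF", 16)
G = 2

def H(x: int) -> bytes:
    return hashlib.sha256(str(x).encode()).digest()

def xor(key: bytes, msg: bytes) -> bytes:
    return bytes(k ^ m for k, m in zip(key, msg))

m0, m1 = b"message zero....", b"message one....."
choice = 1  # receiver's secret choice bit

# Sender: pick secret a, publish A = g^a
a = 123456789
A = pow(G, a, P)

# Receiver: pick secret b; send B = g^b if choice == 0, else B = A * g^b
b = 987654321
B = pow(G, b, P) if choice == 0 else (A * pow(G, b, P)) % P
k_r = H(pow(A, b, P))  # receiver's key = H(g^(ab))

# Sender: derive two keys; only one matches the receiver's key
k0 = H(pow(B, a, P))                        # equals k_r iff choice == 0
k1 = H(pow((B * pow(A, -1, P)) % P, a, P))  # equals k_r iff choice == 1
c0, c1 = xor(k0, m0), xor(k1, m1)           # send both ciphertexts

# Receiver: can only decrypt the chosen ciphertext
recovered = xor(k_r, c1 if choice == 1 else c0)
print(recovered)  # b'message one.....'
```

The receiver's choice bit stays hidden because B looks like a random group element either way, while the unchosen key remains unknown to the receiver under the Diffie-Hellman assumption.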