Spec || Python library

Probably the most interesting tool I came across when I started out with JavaScript was JSPerf, second only to JSLint [and, to a smaller extent, JSHint, thanks to an extreme affinity for Crockford’s ideals]. The tool, built over the Benchmark.js library, provides a *visual* comparison of parallel implementations of a problem in terms of ops/sec. I could find some existing Python attempts at the same idea, but for some queer reason none of them appealed to me, so I sat down to write one myself, much like the joke about competing standards.
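To make the JSPerf idea concrete on the Python side, here is a minimal sketch of that kind of ops/sec comparison using the standard-library `timeit` module. The two implementations and the `ops_per_sec` helper are illustrative only, not part of any existing library:

```python
import timeit

# Two hypothetical parallel implementations of the same problem:
# building a list of squares.
def squares_loop(n=1000):
    result = []
    for i in range(n):
        result.append(i * i)
    return result

def squares_comprehension(n=1000):
    return [i * i for i in range(n)]

def ops_per_sec(func, repeat=5, number=1000):
    """Return the best ops/sec observed over several timing runs."""
    times = timeit.repeat(func, repeat=repeat, number=number)
    # Take the fastest run: it is the least polluted by system noise.
    return number / min(times)

for impl in (squares_loop, squares_comprehension):
    print(f"{impl.__name__}: {ops_per_sec(impl):,.0f} ops/sec")
```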

The one pertinent concern, to me at least, is the kind of reporting that speed.pypy.org offers: graphical, because that is how you figure things out more easily, irrespective of how much you love the shell and plain terminal output.
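As an illustration of that graphical end, a rough sketch using matplotlib (an assumption; any plotting backend would do). The numbers are made-up placeholders standing in for real measurements from the comparison above:

```python
import matplotlib.pyplot as plt

# Hypothetical ops/sec results; in practice these would come from
# the timing harness, not be hard-coded.
results = {
    "squares_loop": 48_000,
    "squares_comprehension": 91_000,
}

fig, ax = plt.subplots()
ax.bar(list(results.keys()), list(results.values()))
ax.set_ylabel("ops/sec")
ax.set_title("Implementation comparison")
plt.tight_layout()
plt.savefig("comparison.png")  # or plt.show() for interactive use
```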

There could be two outcomes to this implementation attempt: either it ends up covering the practical ways of profiling implementations with a visually comprehensible report, or I end up understanding the nuances of working in a Pythonic manner. A win-win either way, I guess. I have created a rudimentary port, stripped down to work with timeit initially, from an existing local repository that provides dis data along with profile/cProfile stats. Suggestions welcome.
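For reference, a hedged sketch of what gathering dis data alongside cProfile stats for a single implementation might look like; the `inspect_impl` helper and its interface are my own invention here, not the port’s actual API:

```python
import cProfile
import dis
import io
import pstats

def inspect_impl(func, *args, **kwargs):
    """Print bytecode (dis) and profiling stats (cProfile) for one call.

    Illustrative only: a rough stand-in for the kind of data the
    local repository mentioned above reportedly collects.
    """
    # Bytecode disassembly of the implementation under test.
    dis.dis(func)

    # Profile a single invocation.
    profiler = cProfile.Profile()
    profiler.enable()
    func(*args, **kwargs)
    profiler.disable()

    # Dump the ten hottest entries by cumulative time.
    stream = io.StringIO()
    pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(10)
    print(stream.getvalue())

def squares(n=1000):
    return [i * i for i in range(n)]

inspect_impl(squares)
```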
