Flutterby™! : Pythonic Expressionism


Pythonic Expressionism

2012-02-14 19:27:59.674575+00 by ebradway 5 comments

Initially, I thought of just sending an email to SPL, but decided to put it here to get wider input.

I've been writing a lot of Python code lately. One thing that amazes me is how I am able to bang on code for an extended period of time, adding tens (if not hundreds) of lines of code, changing multiple modules, and yet, the program does what I want it to do surprisingly often!

In contrast, when I used to write C/C++ for a living, I would typically spend more time in the debugger than the code editor. And that was only AFTER the code would actually compile. Similarly, VisualBasic would let me make lots of changes without compile-time errors but often did not do what I intended. With Perl, it seemed that once you knew the complex, magic incantation, you could do amazing things but until you got the spell just right, the results were amazingly horrible.

What is it about Python that reduces the "impedance mismatch" between my intent and the resulting code? This is confounded by the fact that I am constantly having to remind myself of simple language constructs that I know cold in C/C++, VB and Perl. Even though I constantly have a browser window pointed to some version of the Python docs, I am considerably more productive as a programmer.

[ related topics: Perl Software Engineering Work, productivity and environment Python hubris ]

comments in ascending chronological order:

#Comment Re: made: 2012-02-14 19:58:03.618431+00 by: Dan Lyke

I have that experience in Perl. My experience in Python often actually leaves me wondering why I'm not just doing this in C++, but I also suspect my C++ went off in a different direction than yours.

However, I was recently trying to understand an application framework, ended up in the Perl debugger, and realized it was the first time in years and years that I'd been in a debugger rather than using "printf" (or the appropriate language analogue) for better code understanding.

One of the things that gelled as I stepped through the call stack and hierarchy is that there's a particular style of OO programming that leads to needing a debugger to figure stuff out, and I'm gonna throw a couple of ideas out there and see what sticks.

Way back in the '90s, I was part of a team working on a C++ 3D game framework that used inheritance to abstract machine differences out. It worked on SGI OpenGL (may have even worked on GL), directly to a framebuffer on a couple of different platforms, and on Voodoo 2 hardware (remember that?), and at some point the three of us decided that we'd made a mistake and should have kept shared code entirely out of that hierarchy.

One of the things that Python taught me is that "is a" and "does" are completely different things, and "does" is the more important set of behaviors. In fact, Python doesn't care about "does" until run-time: have an object? Invoke the method on it, see if it fails.

I think that that fluidity encourages a clarity of thinking that leads to small module blocks that do specific things, that lets you concentrate on the commonalities of implementation at coding time rather than at design time, and that makes control flow much more obvious. It keeps away from "I'm in this code and don't know which of my ancestors that method is getting dispatched to" and leads towards "this object will either do the right thing or give me a stack trace".

It lets us repurpose code on the fly, operations no longer care what types they're passed, only that the types respond to the things they're asked to do.
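A minimal sketch of that "does, not is-a" idea; the class and function names here are invented for illustration:

```python
class Duck:
    def speak(self):
        return "quack"

class Robot:
    def speak(self):
        return "beep"

def announce(thing):
    # No isinstance() check and no shared base class: all we care
    # about is that `thing` responds to speak().
    return thing.speak()

print(announce(Duck()))   # quack
print(announce(Robot()))  # beep
```

Neither class inherits from the other, and `announce` never asks what it was given; the "interface" is just the set of operations actually applied.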

And closures and code being just another object is an obvious continuation of that: The way Python encourages lambda operations changed not only how I write Perl, but also how I write C and C++.
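A small illustration of closures and code-as-just-another-object in that spirit (the names are made up):

```python
def make_counter():
    # The inner function closes over `count`; the resulting closure
    # is an ordinary object you can pass around, store, or return.
    count = 0
    def bump():
        nonlocal count
        count += 1
        return count
    return bump

c = make_counter()
print(c(), c(), c())  # 1 2 3

# Lambdas let you hand behavior to generic operations:
words = ["pear", "fig", "banana"]
print(sorted(words, key=lambda w: len(w)))  # ['fig', 'pear', 'banana']
```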

Maybe it boils down to something as simple as getting past the cognitive load of planning app structure. Who cares what the interface is? It'll be defined by the operations we apply to it. The type of that variable is defined by the data I put in it. Named parameters mean extending function behavior is easy.
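The named-parameters point in a quick sketch; the function and parameter names are hypothetical:

```python
def connect(host, port=5432, retries=3, timeout=None):
    # `timeout` was added later; callers that never heard of it
    # are unaffected because it has a default.
    return {"host": host, "port": port, "retries": retries, "timeout": timeout}

# Old call sites keep working unchanged...
connect("db.example.com")
# ...while new ones opt in to the new behavior by name.
connect("db.example.com", timeout=2.5)
```

Because arguments are named and defaulted, extending a function rarely forces every existing caller to change.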

All of this points to a notion that we do better if we can put off worrying about the details until we're immersed in that part of understanding the process, and if we can pick a random part of the project and start pulling string rather than trying to understand the entire knot before we dive in.

#Comment Re: made: 2012-02-14 20:12:07.861855+00 by: Dan Lyke

Oh, and it goes back to another gripe I had that we probably discussed in the very early '90s: back then we were using comparisons between software development and building a bridge. Writing software isn't like building a bridge, it's like designing a bridge. So much of the attitude that "software engineering" is based on is a holdover from when "submitting a job" and getting the output was expensive, and it assumes that the code is the end result.

The code is the process. The code is the sheets of paper on which ideas are tried out. Software environments should encourage "what-if?"s and temporary scaffolding; the process should tell you if something doesn't work, not make you figure that out beforehand.

I think that's what Python is so good at: You wanna try that method on this object? Sure! Let's see if it works! No? Okay, it's easy to figure out where in the code to make that work.

Rather than making you think all of that through beforehand, and yet still pushing a lot of the cognitive load back onto you (i.e., C++ constructors and exceptions and the like).
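That "try it and let the traceback tell you" style is sometimes called EAFP ("easier to ask forgiveness than permission") in Python circles; here it is in a minimal made-up example:

```python
def describe(obj):
    # EAFP: just call the method; if the object doesn't support it,
    # the AttributeError points straight at what's missing.
    try:
        return obj.describe()
    except AttributeError as err:
        return f"no dice: {err}"

class Widget:
    def describe(self):
        return "a widget"

print(describe(Widget()))  # a widget
print(describe(42))        # no dice: 'int' object has no attribute 'describe'
```

The failure mode is an explicit error naming the object's type and the missing attribute, which is exactly the "do the right thing or give me a stack trace" behavior described above.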

#Comment Re: made: 2012-02-14 20:45:30.889284+00 by: Dan Lyke

Oh, and a realization about Perl-ness: context awareness, like operator or return-type overloading, makes the "easy to figure out where in the code to make that work" part harder. I've been working with Perl's SOAP::WSDL objects, and if you print out a non-terminal type, it gives you the containing XML. What I usually really want when I guess wrong is the type, so that I can figure out what to perldoc and change. Giving me the actual ref, rather than being "smart" and overloading the string return value to print out the contents, would be far preferable.

This, I think, is something Python gets right. You don't have the same potential for error with list vs scalar vs whatever context in Perl. You don't have file handles shifted left by string values as in C++. You don't have nothing subscripted by a series of tokens, as in Objective-C. The language is the language, a fairly simple set of constructs, not some mutable thing that may be different depending on who's writing it.
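A small illustration of that point: in Python, poking at an object you guessed wrong about hands you its type rather than an overloaded stringification (the class name is invented for illustration):

```python
class Response:
    pass

r = Response()
# The default repr names the type, not some serialization of contents:
print(r)        # <__main__.Response object at 0x...>
print(type(r))  # <class '__main__.Response'>

# And dir() answers "what can this thing actually do?"
print([name for name in dir(r) if not name.startswith("_")])  # []
```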

Although I haven't looked at it deeply, I think that core simplicity is a big part of what makes Lua popular, too.

#Comment Re: made: 2012-02-15 01:04:44.321594+00 by: spc476

I could never get into Perl, and the times I did, I retreated right back to C, where it was easier on so many levels (no real need for regexes to manipulate strings; real structures and not that bizarre scalar/reference crap). Yes, I've come across some rather difficult bugs (I learned the hard way that signal handlers and multithreaded programming are identical), but overall, I found C much nicer.

At work, our lead developer has a thing for abstraction: he loves to hide the implementation details from itself (really! Everything is abstracted out), yet he goes in for micro-optimizations that are just really odd. Yes, we're developing for Android, but we're not working on a 1 MHz 6502 with 4K. And he's doing everything possible to make it impossible to use Eclipse (I just want to use it to view the object hierarchy, but noooooo! He's got to push Java code through cpp (no, really!) and okay, I'll stop here before I really start ranting). I'm beginning to really appreciate straightforward code that isn't C++ or Java.

I like Lua and use it quite a bit these days. The only thing I hate about Lua (and about dynamic languages in general) is that typos aren't found until runtime, and in my case, having embedded Lua in an application, the error reporting is sometimes lacking. This is especially problematic for testing on an SS7 stack (as in telephony): a service that goes up and down (like, you know, during testing!) causes nearby SS7 service points to remove your endpoint from the rest of the SS7 network because it's flapping too much. Our stuff is around the TCP/SMTP layers of IP, so imagine a router that just drops the route to your computer because your SMTP server goes up and down. Grrrrrrrr. Okay, I'll stop ranting again.

#Comment Re: made: 2012-02-21 03:12:05.429057+00 by: ebradway

Dan: I could never let myself really go wild in Perl. I spent far too much time working on other people's Perl code to do that to another programmer. Python is very easy to understand and modify even if you don't know the language well. Perl is hard to understand and difficult to modify, especially if you know the language well.

spc: You pretty much have to have pylint set up to write Python well. And I generally don't need the level of control C provides. If I do, there are many modules for Python, like NumPy, that fix the common problems without sacrificing the higher level of expression.