Comments:
Also, there is "A Prettier Printer" by Wadler. I implemented an SML version of it (I think this is the one I based it on), which is rather an amusing exercise because the original takes (slight) advantage of the lazy, memoizing nature of Haskell.
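The algebra at the paper's core is small enough to sketch. Here is a hedged TypeScript rendering of just the simple fragment -- the constructor names follow the paper, but this strict version leaves out the `group`/`best` machinery whose efficiency is what depends on the laziness mentioned above:

```typescript
// Minimal sketch of the document algebra from "A Prettier Printer"
// (simple fragment only; no group/best, so no alternative layouts).
type Doc =
  | { tag: "text"; s: string }
  | { tag: "line" }
  | { tag: "nest"; i: number; d: Doc }
  | { tag: "cat"; l: Doc; r: Doc };

const text = (s: string): Doc => ({ tag: "text", s });
const line: Doc = { tag: "line" };
const nest = (i: number, d: Doc): Doc => ({ tag: "nest", i, d });
const cat = (l: Doc, r: Doc): Doc => ({ tag: "cat", l, r });

// Render a Doc; `i` is the indentation inserted after each line break.
const layout = (d: Doc, i = 0): string => {
  switch (d.tag) {
    case "text": return d.s;
    case "line": return "\n" + " ".repeat(i);
    case "nest": return layout(d.d, i + d.i);
    case "cat":  return layout(d.l, i) + layout(d.r, i);
  }
};
```

For example, `layout(cat(text("if x"), nest(2, cat(line, text("then y")))))` yields a two-line layout with the second line indented by two spaces.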
This was for my awful attempt at implementing LTT.
This paper is, as you might imagine coming from Wadler, totally awesome, so you should read it.
Yeah, it is pretty sweet. Thanks for the pointer.
Also I'm not sure who "Normal Ramsey" is. :P
Man, I'm totally going to let you proof-read my next paper, whatever it is. I am clearly a bit too eager in reducing things (be they terms or authors) to normal form lately :P
What, pray tell, is wrong with printf?
(ok, that bait was too obvious -- but for basic types, it does a really nice job -- much better than any other system I've seen)
1. It's not type safe, because you can't check that the argument list and the format string match. So code like
#include <stdio.h>

int main(int argc, char **argv) {
    char *fmt = "This is a test of the emergency %f system\n";
    printf(fmt, 17);
    return 0;
}
will pass most compilers without any warnings or errors, even with all the warnings turned on.
2. It only works for basic types; if you want to use data abstraction and all that good stuff, it's nice to have an API that lets you add new format specifiers for new data types.
Danvy's paper addresses both problems. With a little help from the compiler for the literal syntax, you could do something much, much nicer than printf.
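To make that concrete, here is a hedged TypeScript sketch of the continuation-passing trick from Danvy's "Functional Unparsing" (the paper works in ML; the names here are mine, not its API). Each directive extends a string-building continuation, so the type of the assembled format dictates exactly which arguments the caller must supply:

```typescript
// Danvy-style CPS format directives (illustrative names, not the
// paper's ML API). A format transforms a continuation over the
// accumulated string; each data-consuming directive adds one
// argument to the final type.
type Cont<A> = (acc: string) => A;

const lit = (s: string) => <A>(k: Cont<A>): Cont<A> =>
  acc => k(acc + s);

const str = <A>(k: Cont<A>): Cont<(s: string) => A> =>
  acc => s => k(acc + s);

const int = <A>(k: Cont<A>): Cont<(n: number) => A> =>
  acc => n => k(acc + String(n));

// Close off a format: identity continuation, empty buffer to start.
const done: Cont<string> = acc => acc;

// Formats compose by nesting; the type checker tracks the holes.
const fmt = lit("This is a test of the emergency ")(str(lit(" system")(done)));
// fmt("") : (s: string) => string  -- a typed counterpart of "%s"
```

Here `fmt("")("broadcast")` fills in the hole, while `fmt("")(17)` is a compile-time type error rather than undefined behavior; and unlike a C format string, `fmt` is a first-class value you can pass around without losing the check.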
Just because I keep hearing this excuse: (1) is no longer true. But (2) is as fatal a problem as ever.
Re 1: I tried it with gcc -Wall and got no warnings. Is there a "no, really, I want ALL the warnings" setting I can turn on?
-Wformat to get printf warnings; -Wextra to get a bunch of useful warnings (known as -W in 3.2 and before)
freaky; that particular one doesn't fail.
oh, wait, you have a string you're passing in to printf? Who the hell does *that*?!?
Well, it doesn't give any warnings, so it must be considered legitimate, right? ;)
This is kind of the whole crux of why printf is considered dangerous. If the compiler only allowed string literals there as a "mini-language" of format strings that (a) are not first-class things that could be passed around and constructed at run-time and (b) induce, if you're going to be honest about things, a dependent typing problem to be solved, but since you only have literals, it's easy to solve; this is in fact just what gcc does with its warnings about improper printf calls with string literals.
What the Danvy paper's technique allows is the creation of "format strings" that you can pass around, and also safely eliminate against things to fill in their holes.
Whoops, I meant to say "if the compiler only allowed string literals, then at least it would be safe"
I think this gets to the crux of the disagreement between people who use C and people who do PL. I never ever ever even consider using anything but a string literal as a format string for printf-like and scanf-like functions. The language allows it, but that's meaningless because I would never type it, and neither would almost anyone else I know. And so, it does not bother me that the type-checking is imperfect: it's fine for the language subset I actually use.
Not that I'm a full-on C addict/proponent. I want lambdas dammit, and I want better (though I don't demand perfect) type checking.
Also, my ranting is done; I'm too lazy to go read up that paper (mostly because I have pressing deadlines).
Well, we fanatics don't demand perfect typechecking either.
It's pretty much a foregone conclusion that a well-typed program can do things you don't want it to, return the wrong answer, etc. It's just irritating when a language is so unsafe that it's hard to even be sure the result of running a program doesn't depend on whatever happens to be in memory today. Even as a C programmer, I would be happier if the compiler either didn't allow me to mistakenly think I can get away with non-literal strings, or else gave me lambdas to build better facilities out of.
I've done it -- that is, created the format string as a non-literal. Since the compiler I was using at the time did no checking at all, it really made no difference. Of course, the thing I was doing could have been done much more elegantly in C++ using the stream operators.
The problem with your argument is that some crazy programmer (such as me), or a junior programmer, or a smart programmer who makes a mistake, will end up doing one of those crazy things that the language allows but the programmer almost never wants to do. Disallowing these sorts of things is A Good Thing.
From: daev 2006-07-06 05:51 pm (UTC)
Thanks for posting this. I was doing a search for ways to convert abstract syntax trees into parsable, natural output, in a way that handles the vagaries of complex language syntax, and ran across your LJ. That Ramsey article looks like exactly what I need. (What would you suggest for determining whether any later published work has improved on it?)
No problem! I am not an expert or active researcher on the subject, so I don't have anything else off the top of my head to recommend; I had only been looking around for this stuff for a side project that didn't come to much.