From: cross@augusta.math.psu.edu (Dan Cross)
Newsgroups: comp.lang.ada,comp.lang.c,comp.lang.c++,comp.lang.functional
Subject: Re: How Ada could have prevented the Red Code distributed denial of service attack.
Date: 1 Aug 2001 18:44:39 -0400
Organization: Mememememememmeme
Message-ID: <9ka0on$me1@augusta.math.psu.edu>
References: <%CX97.14134$ar1.47393@www.newsranger.com>

In article , Mike Smith wrote:
>Yes, I do.  However, what I also understand is that buffer overflow problems
>are a *bug*, not a "feature", and they are a bug in the *application code*,
>not the language.  Only improperly written C code can contain buffer
>overflow problems, and there is absolutely *no* excuse for finding them in
>C++ code, because the STL can be used to eliminate them completely.

Well, the same could be said of assembly language programming, but do we program major software systems in assembler?  And of course it's tautological that only erroneous programs have bugs.  However, just because we can use a tool effectively doesn't mean that's the tool to use.  I could use a chainsaw to cut my butter in the morning when I put it on my bagel, but it's a lot safer and easier to use a butter knife instead.

A similar analogy can be drawn with software: just because someone can write correct C code in a large software system doesn't mean they should.  I mean, why should they?  If it's easier and less error-prone to use, say, Ada (or Eiffel, or ...), why not make use of those languages instead?

One reason might be, ``well, I like C.''  Hey, that's great; I like C too (heaven forbid! :-); it's absolutely great as a low-level systems programming language.  But I don't like using it for application programming, because I think it lacks a lot of useful higher-level concepts that make application programming easier (e.g., a built-in first-class string type, built-in ADTs, true strong typing, etc.).  It's a pain to implement those again and again, but it also makes no sense to add them to C, since other languages already have those facilities.  I don't need or want C to have all those things, because then it becomes less effective for systems programming.

People tend to forget that a programming language is a tool, and no single tool is appropriate for every job (you don't put in a screw with a hammer, do you?  You do?  Watch out for your house falling over, then. :-).
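To make the overflow issue concrete, here is a minimal sketch in C (the function names and the 64-byte buffer are made up for illustration, not taken from any real server): an attacker-controlled string copied into a fixed-size stack buffer with no length check, which is essentially the class of bug Code Red exploited, next to the manual bounds checking C makes you remember every single time.

    #include <stdio.h>
    #include <string.h>

    /* Unsafe: anything longer than 63 characters overwrites
       whatever sits next to buf on the stack.                */
    void handle_request_unsafe(const char *request)
    {
        char buf[64];
        strcpy(buf, request);             /* no bounds check */
        printf("handling: %s\n", buf);
    }

    /* Safer, but only because the programmer remembered to
       check; the language itself never will.                 */
    void handle_request_checked(const char *request)
    {
        char buf[64];
        strncpy(buf, request, sizeof buf - 1);
        buf[sizeof buf - 1] = '\0';       /* strncpy may not terminate */
        printf("handling: %s\n", buf);
    }

The equivalent out-of-range write in Ada (or Eiffel) raises a runtime exception -- Constraint_Error, in Ada's case -- instead of silently corrupting memory, which is rather the point.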
Programmers in particular tend to develop an almost religious devotion to their tools, so we end up with major software applications written in C when they could be written much more profitably in something like Ada (or ML, Lisp, Prolog, Eiffel, Python, or whatever).

So, in summary, an old cliche is apropos: pick the best tool for the job.  Often, when analyzing what one is doing, one will find that C isn't that tool.  Sometimes it is, but it depends on the context.  As specifically regards applications programming, it's been my experience that C isn't usually the best choice.  Those who believe that moving to other languages won't eliminate buffer overflows are probably correct, but I submit that they would be greatly diminished, and the empirical evidence backs me up.

	- Dan C.