Clever, or Wasteful Programming?


  • #16
    lateralis, have you tried turning off animations? Maybe that will cool your computer down. When those workers stop sweating, so will you.

    nbarclay, thinking back, I remember that I had event-driven programming at school in 94-95. But I also recall using interrupts to avoid polling at least five years before that (with my good ol' cbm64). BTW, interesting test.
    Don't eat the yellow snow.

    Comment


    • #17
      I hadn't really thought that much about the effect on notebooks running off a battery, but that does provide another very good reason for programmers not to waste CPU cycles. The better notebook CPUs get at throttling back to provide only the processing power (and associated energy expenditure) needed at any given time, the more wasteful it is for programs to use every CPU cycle available whether they truly need it or not. Within a year or two, notebooks with ten times Civ 3's official minimum processing requirement (a 300MHz Pentium II) should be available!

      Another issue that occurs to me is, what happens if someone with a relatively powerful machine wants to play Civ 3 in the foreground while doing something time-consuming and CPU-intensive (like re-encoding a long video clip) in the background? In theory, every cycle not truly needed by Civ 3 should be available for the other program. But in practice, the fact that Civ 3 is in the foreground and wants every cycle the OS is willing to give it can get nasty. (I did another Dhrystone run with Civ 3 idling in the foreground and Dhrystone in the background, and Dhrystone took a little over four times as long as it did running by itself.) Granted, not many machines could do that sort of multitasking very well when Civ 3 first came out even if Civ 3 were friendlier toward it. But if past history is any indication, by the time Civ 3's life cycle ends, cheap entry-level computers will be more powerful than today's newest, most powerful machines.
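      The contrast Nathan describes can be sketched in Python (a modern stand-in for illustration, not anything from Civ 3's actual code): a frame loop that busy-waits eats every cycle the OS will give it, while one that sleeps hands every unneeded cycle back to the background job.

```python
import time

def busy_wait_until(deadline):
    # Wasteful: spins at 100% CPU until the deadline, starving any
    # background task (like a video encode) of cycles it never needed.
    while time.monotonic() < deadline:
        pass

def sleep_until(deadline):
    # Cooperative: yields the CPU back to the OS scheduler until the
    # deadline, so the unneeded cycles really are available to others.
    remaining = deadline - time.monotonic()
    if remaining > 0:
        time.sleep(remaining)
```

      Both functions return at (or just after) the deadline; the only difference is what the rest of the machine gets to do in the meantime.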

      Nathan

      Comment


      • #18
        Originally posted by bongo
        nbarclay, thinking back, I remember that I had event-driven programming at school in 94-95. But I also recall using interrupts to avoid polling at least five years before that (with my good ol' cbm64). BTW, interesting test.
        I agree that the polling vs. interrupts issue provided some background in that sort of efficiency concern, but it wasn't a "first two weeks" type of thing and it wasn't necessarily a type of thing most students got much chance to deal with in practice. Also, even though you used interrupts to avoid polling on that C64, I wouldn't be shocked if you wasted CPU cycles right and left using delay loops to control timing. (I don't remember whether the C64 had a sleep routine with sub-second granularity or not.)

        I do have interesting memories of an assembly language class where we used delay loops on Apple IIs to control how quickly the system made its soft little clicking sound. That's how you made musical tones on those machines: by making them click at a given frequency. And it's indicative of the philosophy of programming on personal computers at the time: your program owns the computer, so if you don't need the CPU for something else, it's just fine to sit and spin.
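        A rough sketch of that delay-loop idea (the clock speed and cycles-per-iteration figures below are hypothetical round numbers, not measured Apple II values; real routines were hand-counted cycle by cycle): on a machine with a fixed clock, a counted loop *is* the timer, so the iteration count between speaker clicks maps directly to a pitch.

```python
def delay_iterations_for_pitch(cpu_hz, cycles_per_iteration, pitch_hz):
    # On a fixed-clock machine, N loop iterations always take the same
    # wall-clock time, so clicking the speaker once per delay produces
    # roughly pitch_hz clicks per second -- an audible tone.
    iterations_per_second = cpu_hz / cycles_per_iteration
    return int(iterations_per_second / pitch_hz)

# E.g., at a hypothetical 1 MHz clock and 10 cycles per iteration,
# a 440 Hz tone needs delay_iterations_for_pitch(1_000_000, 10, 440)
# iterations between clicks. Note the CPU does nothing else meanwhile.
```

        The same arithmetic also shows why such programs broke on faster machines: double the clock and every tone jumps an octave unless the loop counts are rewritten.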

        Nathan

        Comment


        • #19
          If you want to do other work in the background while playing Civ 3, you need a multi-CPU system. Multitasking was possible under Win3.x, but it improved a great deal with Win9x (including ME). The biggest improvement, though, came with NT (including W2K and XP). It's still possible for a program to use 100% of the CPU and in effect cripple the computer, or at least turn it into a single-tasking machine. If you have a multi-processor system, that program will use 100% of _one_ CPU; the rest will be available for other tasks.
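          That last point can be sketched in Python (a made-up illustration, nothing Civ 3-specific): a single CPU-bound process can saturate only one CPU at a time, so on a multi-processor machine the remaining CPUs stay free for other work.

```python
import multiprocessing as mp
import os
import time

def spin(stop):
    # A CPU-bound loop, like a game that polls constantly: it can peg
    # at most one CPU at 100%, since a single thread of execution runs
    # on only one processor at a time.
    while not stop.is_set():
        pass

def cores_left_free(seconds=0.1):
    # Run the spinner briefly, then report how many other CPUs remain
    # available for background tasks on this machine.
    stop = mp.Event()
    worker = mp.Process(target=spin, args=(stop,))
    worker.start()
    time.sleep(seconds)
    stop.set()
    worker.join()
    return max(0, (os.cpu_count() or 1) - 1)
```

          On a single-CPU machine this returns 0, which is exactly the "single-tasking computer" situation described above.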
          Don't eat the yellow snow.

          Comment
